Metadata-Version: 2.1
Name: bc-analyzer
Version: 0.2.1
Summary: A package to create a BehaviorCloud compatible data analyzer.
Home-page: https://github.com/behaviorcloud/bc-analyzer-py
License: MIT
Keywords: behaviorcloud,behavior,cloud,analyzer,analyze,analysis,data,daemon,api
Author: Christian Lent
Author-email: christian@behaviorcloud.com
Requires-Python: >=3.6,<4.0
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Topic :: Software Development :: Build Tools
Requires-Dist: black (>=19.10b0,<20.0)
Requires-Dist: deepdiff (>=4.3.2,<5.0.0)
Requires-Dist: flake8 (>=3.8.1,<4.0.0)
Requires-Dist: python-dateutil (>=2.8.1,<3.0.0)
Requires-Dist: requests (==2.23.0)
Requires-Dist: sentry-sdk (>=0.14.4,<0.15.0)
Project-URL: Repository, https://github.com/behaviorcloud/bc-analyzer-py
Description-Content-Type: text/markdown

# BehaviorCloud Analyzer
#### Python Helper

[![N|Solid](https://behaviorcloud.com/images/section/logo-4ffacbfa.png)](https://behaviorcloud.com/images/section/logo-4ffacbfa.png)

This is a thin package to help you quickly create a BehaviorCloud analyzer. It handles calls to the BehaviorCloud API server so you don't have to. This package supports the following features:
  - Fully managed analyzer creation - you just make the conversion function!
  - One-off analysis (pass an "--id" command-line parameter)
  - Daemon analysis (use the "--daemon" flag)
  - Exception handling and automatic upload to Sentry.io

Example Usage:
```python
import json

from behaviorcloud.analyzer.data import convert_stream_to_json
from behaviorcloud.analyzer.coordinator import Coordinator

def convert(source_request, source_settings, settings, source, targets):
    target = targets[0]
    source_data = convert_stream_to_json(source_request)
    analyzed_data = [{"processed": True, "original": entry} for entry in source_data]
    return [{
        "data": json.dumps(analyzed_data),
        "extension": "json",
        "id": target["id"],
    }]

coordinator = Coordinator(convert)
coordinator.run()
```
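The list comprehension in `convert` simply wraps each decoded source entry and serializes the result for upload. Isolated from the BehaviorCloud imports, and using hypothetical sample data in place of the output of `convert_stream_to_json`, the transformation looks like this:

```python
import json

# Hypothetical stand-in for the data returned by convert_stream_to_json.
source_data = [{"frame": 1, "x": 10.5}, {"frame": 2, "x": 11.0}]

# Wrap each entry, mirroring the body of convert() above.
analyzed_data = [{"processed": True, "original": entry} for entry in source_data]

# Serialize for upload as the target's "data" payload.
payload = json.dumps(analyzed_data)
print(payload)
```

The returned list pairs this payload with a target `id` and file `extension`, which is what the `Coordinator` uploads on your behalf.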

