Scalability, compression, encryption, high availability, and complex event processing?
Disclaimer: I'm not a real developer, so I've undoubtedly missed major points, overlooked other projects, and mangled terminology, to say the least. Anyway...
Given enough physical resources, what are the fundamental or structural limitations to using Graphite to do the following:
- Scale up to handle many high-frequency data streams (e.g., 100+ megabytes per second, sustained) and the correspondingly very large databases, not necessarily of fixed size. Examples include real-time sensors and financial market data.
- Compress and/or encrypt everything transparently, including incoming, stored, and webapp-selected data. Of course, this depends to a large extent on the capabilities of the database.
- Ensure high availability by guarding, in a distributed fashion, against the loss of both incoming streams and data already received.
- Process events and trigger actions in [near] real time, especially with the ability to add, drop, and tweak easily scripted, live parallel/serial "decision rules" on the fly. Here, ERMA (http://
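For the high-frequency ingestion point, Graphite's carbon daemon does accept metrics over a simple plaintext TCP protocol, one `metric value timestamp` line at a time. Below is a minimal sketch of batching such lines to carbon; the host, port, and metric names are assumptions for illustration, and real high-rate feeds would need connection pooling and backpressure handling on top of this.

```python
import socket
import time

# Carbon's plaintext listener defaults to port 2003; the host here is
# an assumption for illustration.
CARBON_HOST = "127.0.0.1"
CARBON_PORT = 2003

def format_metric(path, value, timestamp=None):
    """Render one metric line in carbon's plaintext format:
    "<metric.path> <value> <unix-timestamp>\\n"."""
    if timestamp is None:
        timestamp = int(time.time())
    return "%s %s %d\n" % (path, value, timestamp)

def send_metrics(lines, host=CARBON_HOST, port=CARBON_PORT):
    """Batch several pre-formatted metric lines into one TCP payload."""
    payload = "".join(lines).encode("ascii")
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(payload)

# Example (hypothetical sensor metric):
line = format_metric("sensors.rig1.pressure", 101.3, 1300000000)
# line == "sensors.rig1.pressure 101.3 1300000000\n"
```

For sustained 100+ MB/s, the question is really whether many carbon/whisper instances can be sharded and relayed fast enough, not whether the wire format itself scales.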
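To make the "live decision rules" point concrete, here is a minimal sketch, not anything Graphite provides today: rules are plain callables over a window of (timestamp, value) points, held in a registry so they can be added, dropped, or tweaked at runtime. All names and thresholds are illustrative.

```python
# Registry of named decision rules; each rule is a callable that takes
# a list of (timestamp, value) points and returns True if it fires.
rules = {}

def add_rule(name, fn):
    """Register (or replace) a rule on the fly."""
    rules[name] = fn

def drop_rule(name):
    """Remove a rule; missing names are ignored."""
    rules.pop(name, None)

def evaluate(points):
    """Run every registered rule; return the names of rules that fired."""
    return [name for name, fn in rules.items() if fn(points)]

# Example rule: the latest value exceeds a fixed threshold.
add_rule("high_pressure", lambda pts: bool(pts) and pts[-1][1] > 100.0)
# Example rule: the mean over the window exceeds a threshold.
add_rule("high_mean",
         lambda pts: bool(pts) and sum(v for _, v in pts) / len(pts) > 50.0)

window = [(1, 40.0), (2, 60.0), (3, 120.0)]
fired = evaluate(window)  # both rules fire on this window
```

A real deployment would feed `evaluate` from live carbon traffic or from polling the webapp's render API, and "parallel/serial" composition would just be ordering or fanning out these callables.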
These possibilities may depart significantly from Graphite's original "niche application". However, I thought I'd ask, since there doesn't appear to be a powerful yet simple, lightweight, modular, and extensible open-source framework for these tasks, and Graphite is quite appealing.
Question information
- Language: English
- Status: Solved
- For: Graphite
- Assignee: No assignee
- Solved by: Pat LeSmithe