SingleStore is introducing a “Data Intensity Index” to help companies understand how data-intensive their RDBMS applications are and how SingleStore’s database can cope with high data intensity levels.
The company provides a relational SQL database that can handle transaction and analytical workloads, and can run on-premises or in the public cloud. Akamai, Comcast, Royal Bank of Canada, and Uber are customers. When a customer requests a ride during surge periods, such as New Year’s Eve, Uber presents a real-time surge price within milliseconds using SingleStore.
A blog by SingleStore principal solutions engineer Ian Gershuny discusses the firm’s data intensity concept and says: “The truth is, a lot of your existing applications should be data intensive,” but they are not because “we limit our designs around bottlenecks. We don’t bring in too much data so our load process doesn’t choke. We report on stale data since our processes are batched to our analytics and reporting databases. We limit access so as not to overload the database.”
Gershuny says: “Even if you’re able to work around these limitations, our need for data-intensive applications is in our future.” An example is delivery trucking: “Five years ago it would have been inconceivable to track a UPS truck driving through a neighborhood. Now, I watch it right on my phone.”
The data intensity concept neatly encapsulates SingleStore’s marketing message: its combined transaction-and-analytics architecture can cope with more data intensity than separate databases can. The Data Intensity Index is a SingleStore notion for an index value based on five variables: an application’s data size, ingest speed, query complexity, query latency (completion time), and concurrency (number of users).
Companies can enter their application’s values for these variables on a SingleStore website page by answering questions, including:
- Dataset size (<1TB, 1-10TB, 10-50TB, 50-100TB, >100TB)
- Data growth rate (<10%, 10-30%, 30-60%, 60-100%, >100%)
- Data ingest speed in rows/sec (<1k, 1k-10k, 10k-100k, 100k-1m, >1m)
- Query complexity in joins (1-2, 3-5, 6+)
- Query completion time need (Minutes, 1-10 secs, 100ms-1sec, 10-100ms, 0-10ms)
- Number of users (<100, <1,000, <10,000, <100,000, >100,000)
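SingleStore has not published how the answers map to the 0–100 score. Purely as an illustration of how a banded questionnaire like this could produce such an index, here is a hypothetical sketch that averages each answer’s position within its band list and scales the result to 100 — the band lists, equal weighting, and function names are assumptions, not SingleStore’s actual formula.

```python
# Hypothetical sketch: score a banded questionnaire on a 0-100 scale.
# The band lists and equal per-question weighting are illustrative
# assumptions; SingleStore's real Data Intensity Index formula is unpublished.

QUESTIONS = {
    "dataset_size":    ["<1TB", "1-10TB", "10-50TB", "50-100TB", ">100TB"],
    "growth_rate":     ["<10%", "10-30%", "30-60%", "60-100%", ">100%"],
    "ingest_rows_s":   ["<1k", "1k-10k", "10k-100k", "100k-1m", ">1m"],
    "join_complexity": ["1-2", "3-5", "6+"],
    "latency_need":    ["Minutes", "1-10 secs", "100ms-1sec", "10-100ms", "0-10ms"],
    "concurrency":     ["<100", "<1,000", "<10,000", "<100,000", ">100,000"],
}

def intensity_index(answers: dict) -> int:
    """Average each answer's position within its band list, scaled to 0-100."""
    total = 0.0
    for question, band in answers.items():
        bands = QUESTIONS[question]
        # 0.0 for the lowest band, 1.0 for the highest
        total += bands.index(band) / (len(bands) - 1)
    return round(100 * total / len(answers))

# Example: a mid-sized analytical application
answers = {
    "dataset_size": "10-50TB",
    "growth_rate": "10-30%",
    "ingest_rows_s": "10k-100k",
    "join_complexity": "3-5",
    "latency_need": "100ms-1sec",
    "concurrency": "<10,000",
}
print(intensity_index(answers))  # → 46
```

Under this toy scheme, an app in the lowest band for every question would score 0 and one in the highest band for every question would score 100, matching the 0–100 range SingleStore describes.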
The assessment is reckoned to take three minutes, and results in an index score between zero and 100. The higher the score, the more data-intensive your application, but SingleStore has not published the score thresholds that separate high-intensity applications from medium- or low-intensity ones. It’s all relative.
The resulting report also includes SingleStore’s assessment of the kind of data infrastructure it reckons an application will need to deliver the best user experience.
Read Gershuny’s blog for a look at SingleStore’s concept and see if it resonates with you. A downloadable white paper is available here. Note that the assessment is not live yet, and will be available on May 3.