Cinchapi Data Platform

The Cinchapi Data Platform (CDP) was designed to make working with real-time data intuitive and efficient.


The Cinchapi Data Platform (CDP) combines machine learning with human intelligence to make working with data more efficient and intuitive. By imposing no schema of its own, the CDP was purpose-built to work with any high-velocity or static data source. This makes it ideal for data analysts and scientists looking for real-time data analytics and insights derived from disparate or decentralized data sources.

The CDP’s machine learning radically reduces tedious, time-consuming data prep and cleanup tasks. Meanwhile, it’s also working to expose interesting or obscured patterns, anomalies, and relationships that warrant a closer look by a data professional.

Once those patterns are found, the platform can act like a “DVR for Data”, allowing users to rewind time to understand how they were created and how they evolved. The analytics engine doesn’t just spit out a glorified spreadsheet – instead, it powers rich visualizations and descriptive text that make clear what has happened and what is happening now, in real time.

The CDP features a context-aware, natural language interface that gets smarter with use. It can understand words, phrases, and acronyms – including company jargon, industry terms, and abbreviations.

Say goodbye to the “I Tarzan. You Jane” NLP interfaces you may have used in the past. Instead of stilted queries like “Sales Cleveland 30-day report”, users can ask conversational questions like “Show me our sales in Cleveland”.

Since the platform’s natural language interface is context-aware, drilling down into data is as simple as asking a few follow-up questions. Finally, a platform that adapts to the way humans work, not the other way around.
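The idea of a context-aware follow-up can be sketched in a few lines. This is a hypothetical illustration of the concept, not the CDP’s actual NLP engine: each follow-up question inherits the filters established by earlier questions, so the user never has to restate them.

```python
# Hypothetical sketch: a conversation accumulates query context so that
# follow-up questions automatically inherit earlier filters.
# (Illustrative only -- not the Cinchapi Data Platform's real API.)
class Conversation:
    def __init__(self):
        self.context = {}  # filters gathered from the conversation so far

    def ask(self, **filters):
        """Merge new filters into the running context and return the full query."""
        self.context.update(filters)
        return dict(self.context)

# "Show me our sales in Cleveland"
chat = Conversation()
q1 = chat.ask(metric="sales", city="Cleveland")

# Follow-up: "Just the last 30 days" -- metric and city carry over.
q2 = chat.ask(window_days=30)

assert q2 == {"metric": "sales", "city": "Cleveland", "window_days": 30}
```

The design point is simply that state lives in the conversation, not in each query, which is what makes short follow-up questions sufficient for drilling down.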

The platform provides an intuitive, three-step #AskSeeAct workflow. First, ask context-aware, conversational questions. Next, see real-time results of these questions displayed visually alongside clear and concise descriptive text. The third step is to act on those results.

As a pluggable platform, the CDP’s capacity to act is virtually boundless. For example, the platform can be used to deploy real-time enterprise automation triggers that commence or adjust workflows; to create custom code snippets to copy and paste into data-driven applications; or to quickly expose real-time data phenomena that warrant immediate investigation and analysis by a human.

Ready to make sense of your data?

Just ask.


Find the relevant data with a few questions.

What can you see?


Real-time data made relevant.

Now you can act.


The Architecture

The Cinchapi Data Platform was purpose-built to make it easy to work with disparate data, to find otherwise hidden relationships between sources, and to let data scientists, analysts, and developers “rewind time”. Much like a DVR for data, once those hidden relationships are uncovered, it becomes possible to go back and see how they were established and how they evolved.

Cinchapi Data Platform Architecture


The foundation of the Cinchapi Data Platform stack is the Concourse Database (ConcourseDB). The ConcourseDB project was founded by Cinchapi, and we continue to maintain it.

ConcourseDB offers a unique array of features, including automatic indexing, version control, and distributed, high-performance ACID transactions. It combines the benefits of graph and schema-less document-oriented databases in one system, and was built from the ground up for flexibility and scalability with minimal tuning. Simplicity and ease of use are first-class design goals.
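The version-control feature is what powers the “DVR for Data” experience: every write is recorded with a timestamp, so any value can be read back as of an earlier point in time. The following is a minimal in-memory sketch of that idea, not the ConcourseDB client API; the class and method names are illustrative assumptions.

```python
# Illustrative sketch of timestamped version control, the idea behind
# "rewinding time" over data. Not the ConcourseDB API.
import bisect

class VersionedStore:
    def __init__(self):
        self._history = {}  # key -> list of (version, value), in write order
        self._clock = 0     # monotonically increasing logical timestamp

    def set(self, key, value):
        """Record a new revision of key and return its version number."""
        self._clock += 1
        self._history.setdefault(key, []).append((self._clock, value))
        return self._clock

    def get(self, key, at=None):
        """Return the value of key as of version `at` (latest if None)."""
        revisions = self._history.get(key, [])
        if not revisions:
            return None
        if at is None:
            return revisions[-1][1]
        versions = [v for v, _ in revisions]
        idx = bisect.bisect_right(versions, at) - 1
        return revisions[idx][1] if idx >= 0 else None

store = VersionedStore()
v1 = store.set("status", "prospect")
store.set("status", "customer")
assert store.get("status") == "customer"            # current value
assert store.get("status", at=v1) == "prospect"     # rewind time
```

Because reads are resolved against the revision history rather than a single current value, “going back in time” is just a lookup with an earlier timestamp.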


Impromptu is a natural-language, real-time analytics engine that enables anyone to visualize trends and get intelligence from any data source on demand. Just ask questions with conversational queries and get results. Need to drill down? No problem – ask follow-up questions to refine your results.


Cinchapi’s Sponge is where all connected data sources flow to be integrated into the Concourse Database. It uses machine learning to make sense of disparate, decentralized data sources, greatly reducing data cleanup and preparation.