At Datacoral, we made an interesting transition to an all-remote team during the craziness of the COVID-affected Q2 2020. We even moved our weekly yoga sessions to Zoom! We also ran a 24-hour remote hackathon where the engineering team worked on several cool ideas to improve our developer velocity as we continue to work remotely.
- We now offer AWS credits for our fully featured, secure free trial!
- We achieved Amazon Linux-2 Ready Designation this quarter.
- Our founder’s paper on Hive won the IEEE ICDE 2020 Influential Paper Award.
We have made several updates to the Datacoral product!
New and updated connectors
- Drift – Drift is a fast-growing conversational marketing platform. Datacoral is one of the very few data integration providers to support it!
- JDBC – Datacoral added a JDBC connector to support several SQL dialects out of the box – OpenAccess, Oracle, PostgreSQL, MySQL, and Microsoft SQL Server.
- NetSuite – We now support syncing over 500 NetSuite objects out of the box using JDBC!
- Our PostgreSQL and MySQL Change Data Capture connectors now have a standardized mechanism to handle all schema changes at the source.
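To illustrate what handling source schema changes involves, here is a minimal, hypothetical sketch of reconciling a source table's columns with the destination. The function names, column lists, and generated ALTER statements are assumptions for illustration only; they do not show Datacoral's actual mechanism.

```python
# Hypothetical sketch: reconciling source schema changes in a CDC pipeline.
# Column lists and ALTER statements are illustrative, not Datacoral's API.

def diff_schemas(source_cols, dest_cols):
    """Return columns to add to and drop from the destination table."""
    added = [c for c in source_cols if c not in dest_cols]
    removed = [c for c in dest_cols if c not in source_cols]
    return added, removed

def alter_statements(table, source_cols, dest_cols, col_types):
    """Turn a schema diff into destination DDL statements."""
    added, removed = diff_schemas(source_cols, dest_cols)
    stmts = [f"ALTER TABLE {table} ADD COLUMN {c} {col_types[c]}" for c in added]
    stmts += [f"ALTER TABLE {table} DROP COLUMN {c}" for c in removed]
    return stmts
```

For example, if the source `orders` table gained a `coupon` column and lost `legacy`, the sketch would emit an ADD COLUMN and a DROP COLUMN statement for the destination.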
Data quality checks
- Datacoral now allows you to use SQL or Python to specify arbitrary data quality checks on your tables to ensure there are no badly formatted values, missing values, and even foreign key violations.
- Data quality checks can be added as additional triggers for transformations to ensure all analyses are run only on good quality data.
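As a rough illustration of the kinds of checks described above, here is a hypothetical Python sketch that flags missing values, badly formatted values, and foreign key violations in a batch of rows. The function name, row shape, and email regex are assumptions for illustration; Datacoral's actual check API is not shown.

```python
# Hypothetical sketch of a row-level data quality check; not Datacoral's API.
# Flags: missing emails, badly formatted emails, and foreign-key violations.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def quality_violations(rows, known_account_ids):
    """Return (row_index, problem) pairs; an empty list means the batch passes."""
    problems = []
    for i, row in enumerate(rows):
        if row.get("email") is None:
            problems.append((i, "missing email"))
        elif not EMAIL_RE.match(row["email"]):
            problems.append((i, "badly formatted email"))
        if row.get("account_id") not in known_account_ids:
            problems.append((i, "foreign key violation: account_id"))
    return problems
```

Used as a trigger, a downstream transformation would run only when this returns an empty list for the incoming batch.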
Integrate Datacoral into your workflows
- Our data triggered orchestration allows our customers to quickly build data pipelines by writing only SQL. You can plug into the same data events to drive your own workflows using tools like Airflow.
- Get more visibility via metadata events that are generated when schema changes are detected at the source, such as when tables or columns are added or removed.
- Get alerts in Slack for both data and metadata events.
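The pattern of plugging into data and metadata events can be sketched as a small event router. The event shape (`type`, `table`, `detail`) and handler names below are assumptions for illustration; they are not Datacoral's event API, and a real handler would post to a Slack webhook or trigger an Airflow DAG rather than format a string.

```python
# Hypothetical sketch: routing data/metadata events to downstream actions.
# The event dict shape and handler registry are illustrative assumptions.

def route_event(event, handlers):
    """Dispatch an event dict to every handler registered for its type."""
    return [handler(event) for handler in handlers.get(event["type"], [])]

def slack_alert(event):
    # In practice this would POST to a Slack webhook; here we just format it.
    return f"[{event['type']}] {event['table']}: {event['detail']}"

handlers = {
    "metadata.schema_change": [slack_alert],
    "data.loaded": [slack_alert],  # an Airflow DAG trigger could also go here
}
```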
Python Batch Transform Functions
- Write Python transformations that use arbitrary libraries within your SQL-based data pipelines.
- Our customer, Jyve, wrote a blog post about how they have been able to use this feature to easily incorporate geospatial indexing into their pipelines.