Using Apache Spark with the Incorta Unified Data Analytics Platform is optional, but doing so unlocks powerful additional use cases.
The Incorta Unified Data Analytics Platform integrates with Apache Spark, allowing you to run additional analytics while optimizing analytics performance. After the initial data load into Apache Parquet files, Apache Spark optimizes the data across those files, allowing the loader to move data to the analytics engine more efficiently.
The following sections contain more information about what Apache Spark is, how to configure it to work with Incorta, and how to troubleshoot Spark:
- Using Spark with Incorta. Learn what Apache Spark is, how Incorta works with it, and how it can enhance your Incorta experience. These pages explain key Apache Spark concepts and the terminology for developing Apache Spark applications and administering an Apache Spark production cluster, as they relate to Incorta.
- Configure Spark. Learn how to configure standalone Spark to work with Materialized Views, whether Spark is bundled with Incorta or distributed across separate nodes.
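
As a rough illustration of what configuring a standalone Spark cluster involves, the sketch below uses standard Spark properties in a `spark-defaults.conf` file. The hostname and all values are placeholder assumptions for illustration only, not Incorta recommendations; consult the Configure Spark page for the settings that apply to your deployment.

```properties
# Hypothetical example values -- tune for your own cluster.
spark.master            spark://spark-host:7077   # standalone master URL (assumed hostname)
spark.executor.memory   4g                        # memory allocated to each executor
spark.executor.cores    2                         # CPU cores per executor
spark.cores.max         8                         # cap on total cores per application
```

These are all standard Apache Spark configuration properties; in an Incorta deployment, equivalent settings are typically managed through Incorta's administration tooling rather than edited by hand.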