Using Spark is optional, but pairing it with Incorta Analytics unlocks powerful additional use cases.
The Incorta Analytics environment integrates with Spark, allowing you to run additional analytics while improving overall performance. After the initial data load into Parquet, Spark optimizes the Parquet data so that the loader can move data to the Analytics Engine more efficiently.
The following sections describe what Spark is, how to configure it to work with Incorta, and how to troubleshoot it:
- Using Spark with Incorta. Learn what Spark is, how Incorta works with Spark, and how Spark can enhance your Incorta experience. These pages also explain key Spark concepts, the terminology used when developing Spark applications, and how to administer a Spark production cluster alongside Incorta.
- Configure Spark. Learn how to configure a standalone Spark installation to work with Materialized Views, whether Spark is bundled with Incorta or distributed across separate nodes.
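To give a sense of what configuring standalone Spark involves, the sketch below shows a hypothetical `spark-defaults.conf` fragment. The property names are standard Apache Spark settings; the values and the master URL (`spark-host`, port `7077`) are illustrative assumptions, not Incorta defaults — consult the Configure Spark page for the settings your deployment actually requires.

```properties
# Illustrative standalone Spark settings (values are assumptions, tune per deployment)

# URL of the standalone Spark master that runs Materialized View jobs
spark.master                   spark://spark-host:7077

# Resources granted to each executor processing MV workloads
spark.executor.memory          4g
spark.executor.cores           2

# Upper bound on total cores a single application may claim on the cluster
spark.cores.max                8

# Number of partitions used for shuffles in Spark SQL (default is 200)
spark.sql.shuffle.partitions   64
```

When Spark is bundled with Incorta, settings like these are typically managed through Incorta's own configuration rather than edited by hand; in a distributed setup, they live on the Spark nodes themselves.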