---

copyright:
  years: 2023
lastupdated: "2023-02-06"

subcollection: AnalyticsEngine

---
{:new_window: target="_blank"} {:shortdesc: .shortdesc} {:codeblock: .codeblock} {:screen: .screen} {:pre: .pre}
# Getting started with {{site.data.keyword.iae_full_notm}} serverless instances
{: #getting-started}
An {{site.data.keyword.iae_full_notm}} serverless instance is allocated compute and memory resources on demand when Spark workloads are deployed. When no application is in a running state, no compute resources are allocated to the {{site.data.keyword.iae_full_notm}} instance. Pricing is based on the actual amount of resources consumed by the instance, billed on a per second basis.
The {{site.data.keyword.iae_full_notm}} Standard Serverless plan for Apache Spark offers the ability to spin up {{site.data.keyword.iae_full_notm}} serverless instances within seconds, customize them with library packages of your choice, and run your Spark workloads.
Currently, you can create {{site.data.keyword.iae_full_notm}} serverless instances only in the US South region.
## Before you begin
{: #getting-started-1}
To start running Spark applications in {{site.data.keyword.iae_full_notm}}, you need:
- An {{site.data.keyword.Bluemix}} account.
- Instance home storage in {{site.data.keyword.cos_full_notm}} that is referenced from the {{site.data.keyword.iae_full_notm}} instance. This storage is used to store the Spark history events created by your applications, as well as any custom library sets that need to be made available to your Spark applications.
- An {{site.data.keyword.iae_full_notm}} serverless instance.
## Provisioning an instance
{: #getting-started-2}
To provision an {{site.data.keyword.iae_full_notm}} instance:
- Get a basic understanding of the architecture and key concepts. See Serverless instance architecture and concepts.
- Provision a serverless instance.
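As an illustration of the provisioning step, the following sketch writes a provisioning parameters file and shows how it might be passed to the IBM Cloud CLI. The instance name, Spark version, endpoint, and HMAC credential values are placeholders, and the exact parameter names should be checked against the provisioning documentation for your plan.

```sh
# Sketch: provisioning parameters for a serverless instance, including the
# instance home storage in Cloud Object Storage. All values are placeholders.
cat > provision.json <<'EOF'
{
  "default_runtime": { "spark_version": "3.3" },
  "instance_home": {
    "region": "us-south",
    "endpoint": "https://s3.direct.us-south.cloud-object-storage.appdomain.cloud",
    "hmac_access_key": "<HMAC_ACCESS_KEY>",
    "hmac_secret_key": "<HMAC_SECRET_KEY>"
  }
}
EOF

# The parameters file could then be used when creating the instance, e.g.:
# ibmcloud resource service-instance-create my-ae-instance \
#   ibmanalyticsengine standard-serverless-spark us-south -p @provision.json
echo "Wrote provision.json"
```
{: codeblock}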
## Running Spark applications
{: #getting-started-3}
To run Spark applications in a serverless {{site.data.keyword.iae_full_notm}} instance:
- Optionally, give users access to the provisioned instance to enable collaboration. See Managing user access to share instances.
- Optionally, customize the instance to fit the requirements of your applications. See Customizing the instance.
- Submit your Spark application by using the Spark application REST API. See Running Spark batch applications.
- Alternatively, submit your Spark application by using the Livy batch API. See Running Spark batch applications using the Livy API.
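To make the two submission paths above concrete, the following sketch builds a sample request body and shows, in commented form, how it might be posted to the Spark application REST API and to the Livy batch API. The instance GUID, bucket, application path, and endpoint URLs are placeholder assumptions; consult the linked topics for the authoritative request format.

```sh
# Sketch: submit a Spark application to a serverless instance. All
# identifiers below are placeholders.
INSTANCE_ID="<INSTANCE_GUID>"
API="https://api.us-south.ae.cloud.ibm.com/v3/analytics_engines/${INSTANCE_ID}"

cat > submit.json <<'EOF'
{
  "application_details": {
    "application": "cos://mybucket.mycosservice/my_spark_app.py",
    "arguments": ["arg1"],
    "conf": { "spark.app.name": "MyFirstApp" }
  }
}
EOF

# Spark application REST API (requires an IAM bearer token):
# curl -X POST "${API}/spark_applications" \
#   -H "Authorization: Bearer ${IAM_TOKEN}" \
#   -H "Content-Type: application/json" -d @submit.json

# Livy-style batch submission (field names follow the Livy batches API):
# curl -X POST "${API}/livy/batches" \
#   -H "Authorization: Bearer ${IAM_TOKEN}" \
#   -H "Content-Type: application/json" \
#   -d '{"file": "cos://mybucket.mycosservice/my_spark_app.py"}'
echo "Wrote submit.json"
```
{: codeblock}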
## Using the {{site.data.keyword.iae_short}} serverless CLI
{: #getting-started-4}
To help you get started quickly with provisioning an {{site.data.keyword.iae_short}} instance and submitting Spark applications, you can use the {{site.data.keyword.iae_short}} serverless CLI.
For an end-to-end scenario, from creating the required services to submitting and managing your Spark applications by using the Analytics Engine CLI, see Create service instances and submit applications using the CLI.
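The CLI flow described above can be sketched as follows. The plug-in name, instance GUID, and application path are assumptions to illustrate the shape of the commands; the linked end-to-end scenario covers the exact invocations.

```sh
# Sketch: submit a Spark application with the Analytics Engine CLI.
# Placeholders throughout; the commands only run if ibmcloud is installed.
if command -v ibmcloud >/dev/null 2>&1; then
  # One-time step: install the Analytics Engine v3 CLI plug-in
  ibmcloud plugin install analytics-engine-v3 -f || true
  # Log in first (interactive): ibmcloud login
  # Submit an application to an existing serverless instance
  ibmcloud analytics-engine spark-app submit \
    --instance-id "<INSTANCE_GUID>" \
    --app "cos://mybucket.mycosservice/my_spark_app.py" || true
else
  echo "ibmcloud CLI not installed; commands shown for reference only"
fi
REACHED_END=1
```
{: codeblock}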