Libraries provide reusable code that you might want to include in your programs or projects for Apache Spark in Azure Synapse Analytics (Azure Synapse Spark).

You might need to update your serverless Apache Spark pool environment for various reasons:

- One of your core dependencies released a new version.
- You need an extra package for training your machine learning model or preparing your data.
- A better package is available, and you no longer need the older package.
- Your team has built a custom package that you need available in your Apache Spark pool.

To make third-party or locally built code available to your applications, install a library onto one of your serverless Apache Spark pools or a notebook session.

There are three levels of packages installed on Azure Synapse Analytics:

- Default: Default packages include a full Anaconda installation, plus extra commonly used libraries. When a Spark instance starts, these libraries are included automatically. You can add more packages at the other levels. For a full list of libraries, see Apache Spark version support.
- Spark pool: All running artifacts can use packages at the Spark pool level. For example, you can attach notebook and Spark job definitions to corresponding Spark pools.
- Session: Libraries installed at the session level are available only within that specific notebook session and aren't persisted across sessions.
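When updating a pool for one of the reasons above (a new dependency version, a replacement package), it helps to confirm which packages and versions are actually present in the running environment. A minimal sketch, using only the Python standard library (`importlib.metadata`, available in Python 3.8+); the helper name `installed_version` is illustrative, not part of any Synapse API:

```python
# Check which packages (and versions) are available in the current Python
# environment, e.g. inside a notebook attached to a Spark pool.
from importlib import metadata
from typing import Optional


def installed_version(package_name: str) -> Optional[str]:
    """Return the installed version of a package, or None if it is absent."""
    try:
        return metadata.version(package_name)
    except metadata.PackageNotFoundError:
        return None


# Example: verify a core dependency before relying on it,
# and confirm an old package is really gone after removal.
print(installed_version("pip"))                      # a version string
print(installed_version("nonexistent-package-xyz"))  # None
```

Running this in a notebook cell before and after a library change gives a quick sanity check that the pool picked up the intended versions.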