Azure Data Lake Storage Gen2 (Python SDK)
This skill lets you interact with Azure Data Lake Storage Gen2 (ADLS Gen2) through the Python SDK. ADLS Gen2 provides a hierarchical file system built on top of Azure Blob Storage and is well suited to big-data analytics workloads. The skill covers creating, reading, updating, and deleting file systems, directories, and files; managing access control lists; setting metadata; and listing contents within the data lake. Asynchronous variants of the core operations are also supported for higher throughput, simplifying the integration of Python applications that store, process, and analyze large datasets in Azure.
When needing to store and process large datasets in Azure.
When building data pipelines and analytics solutions using Python.
When requiring a hierarchical file system for organizing data.
When managing access control and permissions for data within Azure.
When integrating Python applications with Azure Data Lake Storage Gen2.
Inputs: Azure Storage Account URL, credentials (e.g., `DefaultAzureCredential`), file system names, directory paths, file paths, and data to upload.
Outputs: file system objects, directory objects, file objects, file content, properties, access control lists, metadata, and lists of paths.
1. Open Cursor IDE.
2. Create a new Python project.
3. Install the Azure Data Lake Storage Gen2 SDK: `pip install azure-storage-file-datalake azure-identity`.
4. Set the `AZURE_STORAGE_ACCOUNT_URL` environment variable.
5. Import the required modules and use the provided code snippets to interact with your Azure Data Lake Storage Gen2 account.
1. Install Python 3.6 or later.
2. Install the Azure Data Lake Storage Gen2 SDK and Azure Identity library: `pip install azure-storage-file-datalake azure-identity`.
3. Set the `AZURE_STORAGE_ACCOUNT_URL` environment variable to point to your Azure Data Lake Storage Gen2 account endpoint.
4. Authenticate using `DefaultAzureCredential` or other appropriate credential types.
5. Use the provided code examples to perform operations on your data lake.