Thursday, April 25, 2019

Create and Publish an Azure Function with Blob Trigger by using Visual Studio


Azure Functions and Visual Studio


Azure Functions triggered by Blob Trigger using Visual Studio 2017 – Hands on activity 


We have already seen that Azure Functions enables us to execute small pieces of code, or functions, in a serverless environment in the cloud. In simple words, you focus on the code and its functionality rather than worrying about cloud infrastructure. 

In the last hands-on activity, we walked through the creation of an Azure Function with a Blob trigger using the in-portal experience, which was a quick implementation and execution. Apart from the Azure portal, you can develop Azure Functions using Visual Studio 2017 version 15.4 or later. Azure Functions Tools for Visual Studio 2017 is an extension for the IDE that lets you develop and deploy C# functions to Azure. Developing in Visual Studio also gives you a lot of additional flexibility, since you can leverage all the IDE features, including source control integration, testing, and more. 

Have a look at the following key benefits of Azure Functions Tools – 
  • Develop, build, and run functions on your local development workstation.
  • Publish Azure Functions directly to Azure.
  • Use WebJobs attributes to declare function bindings directly in C# code rather than maintaining a separate function.json file.
  • Develop and deploy pre-compiled C# functions.
  • While coding your functions in C#, leverage all the benefits of the Visual Studio IDE.


Here, you will see how to use Azure Functions Tools for Visual Studio 2017 to develop C# functions and publish them to Azure. We will walk through a Blob trigger and validate the function by uploading files to a Blob container. Since we have already covered the provisioning of Azure Storage blobs in previous articles, we will focus on the following tasks –
  • Develop C# functions and publish them to Azure
  • Configure a Blob trigger
  • Validate the function by uploading files to the Blob container

Meanwhile, if required, you can visit a few previous articles for a brief overview of the following artifacts – 


Pre-requisites


Before moving ahead, we need some prerequisites to accomplish this Azure Functions and Blob trigger hands-on activity using Visual Studio 2017.
  1. An Azure subscription; if you don't have an account, sign up for a free Azure account - https://azure.microsoft.com/en-gb/free/
  2. An Azure Storage account.
  3. Microsoft Azure Storage Explorer (optional); you can download and install it from https://storageexplorer.com/
  4. The latest version of Visual Studio 2017 - https://visualstudio.microsoft.com/vs/


Validate the Azure Development Workloads 


Azure Functions Tools comes under the Azure development workload of Visual Studio 2017 version 15.5 or later. It is essential to ensure that you included this workload during the Visual Studio 2017 installation. If not, you will have to modify the installation.

Visual Studio Professional 2017


Validate the Tools Version 


Open Visual Studio 2017 and go to Extensions and Updates under the Tools menu. Expand Installed -> Tools and choose Azure Functions and Web Jobs Tools.

Azure Functions and Web Jobs Tools


Note the installed version; in my case it is already up to date. If an older version is installed in your case, update the tools via Extensions and Updates -> Updates -> Visual Studio Marketplace.

STEP – 1: Copy the Connection String of Storage Account 


In this activity we will work with a Blob trigger, so a Storage account is required. Since we have covered provisioning a Storage account multiple times, we will simply use an existing Storage account.

Log in to the Azure portal https://portal.azure.com/

On the Hub menu, click All services and select the storage account you want to browse; its blade loads in the Azure portal. Here you can see all details and options under different sections such as Essentials and Services.

Storage Account

Proceed by clicking Access keys under Settings; you can see the key and connection string details. Copy the connection string from the key1 section, which will be used later to create a connection from Visual Studio 2017.

Access Keys
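For reference, the copied value follows the standard storage connection string shape (the account name and key below are placeholders, not real values):

```
DefaultEndpointsProtocol=https;AccountName=<your-account-name>;AccountKey=<your-account-key>;EndpointSuffix=core.windows.net
```

Keep this value handy; it is pasted into Visual Studio in a later step.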


STEP – 2: Create an Azure Function Project in Visual Studio


Start a new instance of Visual Studio and click File -> New -> Project, which launches the New Project dialog box. Select the Azure Functions template, available under Installed -> Visual C# -> Cloud in the New Project dialog box.

Azure Functions


Once you have entered the desired project name and location, proceed by clicking the OK button, which launches the function (trigger) selection page.

Azure Functions v1 (.NET Framework)


Here you can see the different types of available functions. Select the Blob trigger and fill in the required properties – 
  • Storage Account – select an existing storage account or go with the Storage Emulator.
  • Connection String Setting – the app setting that holds the Storage account's connection string.
  • Path – the Blob container that the trigger will monitor.
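The Path value can also include a binding expression that captures the blob's file name at runtime – for example, using the container created later in this walkthrough:

```
function-data/{name}
```

Here {name} binds to the name of each uploaded blob.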

Select the Blob trigger and click Browse from the Storage Account drop-down list.

Blob trigger

If you are not connected to your Azure account, a blade will launch through which you can sign in using Add an account.

Add an account


Once connected, you can select the desired existing Storage account for this function activity. Select the Storage account and move ahead by clicking the OK button.

Storage Account


The selected Storage account will then be mapped under Storage Account; next, enter the connection string setting that you copied and noted down in the previous steps. Along with this, you can provide a Path, which will be the name of a new Blob container.

Blob trigger - Storage Account

Next, click the OK button; project creation proceeds, and within a few moments the default class and basic project structure appear.

Solution - Visual Studio

STEP – 3: Validate Connection String


The C# project has now been created and contains the following files – 
  • Function1.cs – the default class file for the Azure Function, i.e. the Blob trigger.
  • host.json – configures the Functions host; it applies both locally and in Azure.
  • local.settings.json – maintains the settings used when running the function locally.


Next, validate the Storage connection string that you provided in the previous steps. Open the local.settings.json file; the connection string should be populated there.

local.settings.json
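If you are curious what the file contains, a typical local.settings.json for this project resembles the following; the values are placeholders, and the setting name may differ if you chose a custom connection string setting:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net",
    "AzureWebJobsDashboard": ""
  }
}
```

The AzureWebJobsStorage value should match the connection string you copied from the portal.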


STEP – 4: Validate Function Class


By default, a class file named Function1.cs has been created; you can rename it as desired. When you try to rename the file, the system will ask for confirmation. 

Renaming a file


Go ahead and click the Yes button; the class will promptly be renamed. Along with this, you can also rename the function itself from Function1 to anything of your choice.

Renaming Function1

Make sure the Connection value is exactly the one you submitted for the Storage account in the previous steps.

In the code section, you can see simple logging code that acknowledges the item being uploaded to the Blob container, including its name and size. It is just a sample; you can perform different processing here as per your needs. 

Log code inside function
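For reference, the generated Blob trigger class in a v1 (.NET Framework) function project resembles the sketch below; the class, function, container, and setting names are the defaults from this walkthrough and may differ in your project:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class Function1
{
    // Fires whenever a blob is added to or updated in the "function-data" container.
    [FunctionName("Function1")]
    public static void Run(
        [BlobTrigger("function-data/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob,
        string name,
        TraceWriter log)
    {
        // Log the name and size of the uploaded blob.
        log.Info($"C# Blob trigger function processed blob\n Name: {name}\n Size: {myBlob.Length} bytes");
    }
}
```

In v2 (.NET Core) projects, the logger parameter is an ILogger from Microsoft.Extensions.Logging instead of a TraceWriter.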


STEP – 5: Publish the Function to Azure


Time to deploy the function: right-click the project and click Publish.

Publish


Once you click Publish, the system will promptly ask you to pick a publish target. It gives the following options – 
  1. Azure Function App
  2. Folder


Pick a publish target


Go ahead, select Create New under Azure Function App, and click the Publish button. You will shortly see the Create App Service page, where you need to provide the following properties – 
  • App Name – a name that identifies your function app.
  • Subscription – the name of your Azure subscription; this can be a free, pay-as-you-go, or other subscription. 
  • Resource Group – the name of the resource group used to organize related resources; you can either create a new resource group or choose an existing one.
  • OS – go with Windows, because serverless hosting is currently only available on Windows.
  • Hosting Plan – go with the default or create a new Consumption plan.
  • Storage Account – go with the default dedicated Storage account used by your function app, or create a new one. 


Create App Service


Once you select the preferred Hosting Plan, click the OK button; the App Service is now almost ready to be provisioned.

Create App Service

Validate the submitted details and click the Create button to create the Azure Function App instance. The app will start deploying, which takes a few moments to configure in Azure. Shortly, you will see the Publish page with the function synced up to Azure.

Publish Page


You can also validate the Azure Function App in the Azure portal: go to All resources and select the Function App that was created from Visual Studio.

Function App blade


Here you can see all the details, such as the site URL and the function details inside the trigger. To validate the Function App, copy the URL and open it in a browser.

Function App is running


Congratulations, your Function App is up & running! 😊

STEP – 6: Connect to Microsoft Azure Storage Explorer 


Now that you have a Blob trigger and a Storage account, you can validate the function by uploading a file to the container. Open the Microsoft Azure Storage Explorer utility and click the connect icon on the left; the Connect to Azure Storage page launches, containing a couple of options.

Connect to Azure Storage

Select the Use a connection string option and click the Next button, which launches the Attach with Connection String blade. Here you need to submit the same connection string that you copied and noted down in the previous steps.

Attach with Connection String


Once you have provided the connection string, proceed by clicking the Next button; shortly you will see the summary details.

Connection Summary


Click Connect if all the details appear correct. 

STEP – 7: Create Blob Container 


Navigate to the storage account and expand it, right-click Blob Containers, and move ahead by clicking Create Blob Container.

Create Blob Container


Next, provide the same name that you gave during function creation in Visual Studio – function-data; this creates a Blob container with that name.

Blob Containers

STEP – 8: Validate the Function 


Now that the desired Blob container exists, you can test the function by uploading a file to it. Navigate back to the Azure portal and select the Function App that was published from Visual Studio. 

Expand the Logs pane at the bottom of the function page. Before proceeding, make sure log streaming is running.

Logs

Next, move to Storage Explorer, select the newly created Blob container function-data, click Upload, and upload files.

Upload files


Click the Upload button once you have selected the file through the Upload files dialog box; soon you will see the list of uploaded files.

File Uploaded

Go back to your function logs and verify that the Blob trigger function has fired and the blob has been read.

Logs


Congratulations, Blob trigger function executed successfully! 😊

This was a quick walk-through of how to create Azure Functions using Visual Studio 2017, though there are other types of triggers that you can explore further. I will follow up with more articles covering different Azure services.  

Keep visiting the blog.


Wednesday, April 24, 2019

A brief walk-through of Azure Cosmos DB


Azure Cosmos DB


Azure Cosmos DB – an Introduction


For an application to respond quickly, it needs low latency and high availability. To achieve this, instances are expected to be deployed in the datacenters closest to the users. The goal is a highly responsive application ecosystem in real-time scenarios, with data available to users within milliseconds.

When it comes to better data management with Azure services, Azure Cosmos DB, a multi-model database service, undoubtedly makes a significant impact. Azure Cosmos DB is a globally distributed database service that allows you to manage your data elastically and independently scale throughput and storage throughout the world. In 2014, Microsoft introduced its first cloud-based NoSQL database, Azure DocumentDB. It was a document-oriented NoSQL database offering a SQL-like querying interface for retrieving document data. Azure Cosmos DB, launched in 2017, is a progression built on top of its ancestor, Azure DocumentDB.

In brief, Azure Cosmos DB (formerly known as DocumentDB) is a multi-model NoSQL database on the Azure cloud platform that can store and process massive amounts of structured, semi-structured, and unstructured data. It provides native support for various APIs to access your database, such as MongoDB, Cassandra, Azure Tables, Gremlin, and SQL.

Azure Cosmos DB


NoSQL – an Introduction


Azure Cosmos DB is a multi-model NoSQL database that provides independent scaling across all Azure regions. If you are conceptually aware of the term NoSQL, it is much easier to grasp Cosmos DB; if not, we can discuss it briefly here.

NoSQL


NoSQL databases are not new; they have been around for quite some time. NoSQL stands for Not Only SQL, or Not SQL; the NoSQL concept was introduced by Carl Strozzi in 1998. It refers to non-relational database management systems – databases and data stores that are not based on Relational Database Management System (RDBMS) principles. NoSQL databases do not require a fixed schema, avoid joins, scale easily, and are designed for distributed data stores dealing with huge volumes of data.

In fact, NoSQL is used for big data and real-time web applications such as Google, Facebook, and Amazon, which deal with terabytes of data on a daily basis. System response time slows down when you use an RDBMS for massive volumes of data. Technically, to fix this, you can either Scale Up (vertical) or Scale Out (horizontal). 
  • Scale Up – upgrade the existing hardware.
  • Scale Out – distribute the database load across multiple hosts as the load increases.


Since the Scale Up (vertical) approach is expensive, and NoSQL databases are non-relational, they Scale Out (horizontally) better than relational databases. In contrast to a traditional RDBMS, where SQL syntax is used to store and retrieve data, a NoSQL database system encompasses a wide range of database technologies that can store structured, semi-structured, and unstructured data.

Scale Up - Out

Following are the key features of NoSQL databases – 
  • Non-relational data model
  • Runs well on clusters
  • Mostly open source
  • Built for the new era of web applications
  • Schema-less
  • Not ACID compliant
  • Supports horizontal scaling


Basically, there are four types of NoSQL databases – 
  1. Key-Value Based – one of the simplest NoSQL databases: a big hash table of keys and values that can easily be looked up to access data. 
  2. Document Based – here, the key is paired with a complex data structure called a document. It treats a document as a whole and avoids splitting it into component name/value pairs.
  3. Column Based – stores data effectively by column and is used for large data sets. Each storage block contains data from only one column.
  4. Graph Based – a network database that uses nodes, edges, and properties to represent and store data.
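To make the document type concrete, a document store keeps each record as a self-contained structure – for example, this hypothetical JSON document with a nested array:

```json
{
  "id": "42",
  "name": "Asha",
  "city": "Pune",
  "orders": [
    { "orderId": "o-1", "total": 250.0 },
    { "orderId": "o-2", "total": 99.5 }
  ]
}
```

A key-value store, by contrast, would treat the whole record as an opaque value looked up by its key.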


NoSQL database types

Azure Cosmos DB Features


In the previous section, we learned that NoSQL databases store and retrieve data differently from a traditional relational database. Microsoft enhanced DocumentDB and came up with Azure Cosmos DB, a globally distributed, multi-model database.

Following are the key capabilities of Azure Cosmos DB – 
  • Global Distribution – Cosmos DB seamlessly replicates your data across Azure geographical regions, ensuring availability and low latency.
  • High Scalability – for massive data, horizontal scaling (Scale Out) makes Cosmos DB highly scalable and durable; it is designed to elastically scale throughput worldwide.
  • Low Latency – Cosmos DB guarantees less than 10 milliseconds latency for both reading and (indexed) writing of data, all around the world.
  • Multi-model and Multi-API Database – Azure Cosmos DB is based on the atom-record-sequence (ARS) data model: atoms (a small set of primitive types), records (structures composed of atoms), and sequences (arrays consisting of atoms, records, or sequences). This supports multiple data models such as documents, tables, graphs, and key-value pairs.
  • Multiple Consistency Models – it offers a spectrum of five consistency levels (Strong, Bounded Staleness, Session, Consistent Prefix, and Eventual) based on consistency and availability trade-offs. 
  • Schema-free – Cosmos DB automatically indexes all the data it ingests without requiring any schema. Since no schema or index management is required, you also don't have to worry about application downtime while migrating schemas.
  • Tooling – apart from the different APIs, you can use strong tooling that simplifies many operations, such as dt.exe and dtui.exe (the Data Migration Tool), the Cosmos DB Emulator, Data Explorer, Capacity Planner, etc.


Azure Cosmos DB key capabilities

Multi APIs Support 


In the earlier version of Azure's NoSQL database (i.e., DocumentDB), only JSON documents were supported. With Azure Cosmos DB, however, you get multiple types of APIs, which let you store and process different types of data accordingly, such as – 
  • Table
  • Key-Value
  • Document
  • Graph
  • Column


Azure Cosmos DB provides the flexibility to access your data through a variety of APIs, with SDKs available in different languages – 
  1. SQL API – Azure Cosmos DB has native support for querying documents (items) using SQL over JSON data. It treats entities as JSON documents.
  2. MongoDB API – primarily used so that applications written for MongoDB can communicate with Azure Cosmos DB. It works with MongoDB's binary version of JSON, called BSON. With minimal changes and native support, you can migrate MongoDB-based applications to Cosmos DB.
  3. Cassandra API – based on the columnar data storage model; using this API, Cassandra-based applications can be migrated over to Cosmos DB.
  4. Graph (Gremlin) API – when data needs to be annotated with meaningful relationships, the Gremlin API can be used. It supports modeling graph data and provides APIs to traverse the graph.
  5. Table API – basically, this API is a progression of Azure Table Storage, and applications using Azure Table Storage can be migrated to Azure Cosmos DB with no code changes.


At a glance, all these APIs look different in terms of structure, though they all share the same underlying Cosmos DB capabilities. The selection of an API is entirely up to your requirements, depending mostly on the data you are going to deal with.
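As an illustration, a SQL API query over JSON documents looks like familiar SQL; the container alias c is the convention, while the property names here are purely illustrative:

```sql
SELECT c.id, c.name
FROM c
WHERE c.city = 'Pune'
ORDER BY c.name
```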

Cosmos DB APIs

Azure Cosmos DB – brief Technical Outline


You create an Azure Cosmos DB by provisioning a database account (which we will cover in the next hands-on activity post), which in turn manages one or more databases. Each database manages users, permissions, and containers. An Azure Cosmos DB container is a schema-agnostic container that holds items along with stored procedures, triggers, user-defined functions, etc. All entities under the database account – databases, users, permissions, containers, and so on – are referred to as resources.

In simple words, Azure Cosmos DB follows this hierarchy – 

Cosmos DB Container
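In text form, the same hierarchy can be sketched roughly as follows (the container and item names vary by API):

```
Database account
└── Database
    ├── Users and permissions
    └── Container (collection, table, or graph, depending on the API)
        ├── Items (documents, rows, or nodes and edges)
        ├── Stored procedures
        ├── Triggers
        └── User-defined functions
```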


Depending on the API discussed above, container and item resources are projected as specialized resource types. A container is horizontally partitioned and replicated across multiple regions. The items in a container are automatically distributed across a set of logical partitions based on the partition key, using the throughput you provisioned.

Cosmos DB resource model

Here, we briefly covered that Azure Cosmos DB is a globally distributed, highly scalable, multi-model service. Azure Cosmos DB enables you to elastically scale throughput and storage across geographical regions. We did not deep dive into the technical overview or the architecture, but you can go through Microsoft's documentation for a deeper dive – 


Along with this conceptual brief, in the next post we will put Azure Cosmos DB to work with some hands-on activities. Keep visiting the blog.


Tuesday, April 16, 2019

Create an Azure Function triggered by a Blob Trigger


Serverless - Azure Functions


Azure Functions triggered by Blob Trigger – Hands on activity 


In a previous article, we came across the serverless computing approach, which allows you to build and run applications and services without thinking about hardware, software, or any of those bits and pieces. In fact, serverless computing is an event-driven application design and deployment model operating in near-real time, where computing resources are provided as scalable cloud services.

When we talk about serverless, it means the code essentially runs on the platform, waiting for execution orders. Azure Functions is one of the serverless components; it serves as a platform to execute code based on events that you specify. Using Azure Functions, you can execute a script or piece of code in response to a variety of events without worrying about the underlying infrastructure.

Microsoft Azure Functions


In the rest of this article, we will dive into some hands-on activity with Azure Functions and triggers. You will see how to execute an Azure Function triggered by a Blob trigger whenever a file is uploaded to Blob storage. Since we have already covered the provisioning of Azure Storage blobs in a previous article, we will focus on the following tasks – 
  • Create an Azure Function
  • Configure a Blob trigger
  • Execute and validate the Azure Function by uploading files to Blob storage.


Meanwhile, if required, you can visit a few previous articles for a brief overview of the following artifacts – 


Pre-requisites


Before moving ahead, we need some prerequisites to accomplish this Azure Functions and Blob trigger hands-on activity on the Azure cloud.
  1. An Azure subscription; if you don't have an account, sign up for a free Azure account - https://azure.microsoft.com/en-gb/free/
  2. Microsoft Azure Storage Explorer; you can download and install it from https://storageexplorer.com/


STEP – 1: Create an Azure Function App 


Log in to the Azure portal https://portal.azure.com/

In the Azure portal, click + Create a resource on the Hub menu and then click Compute in the Azure Marketplace. All available services load under the Featured section; select Function App, or search for it in the search box.

Function App


The Function App blade loads promptly once you click Function App; you need to submit details for the following properties – 
  • App Name – a name that identifies your function app.
  • Subscription – the name of your Azure subscription; this can be a free, pay-as-you-go, or other subscription. 
  • Resource Group – the name of the resource group used to organize related resources; you can either create a new resource group or choose an existing one.
  • OS – go with Windows, because serverless hosting is currently only available on Windows.
  • Hosting Plan – go with the default Consumption plan; resources are added dynamically as required by your functions.
  • Location – ideally, select the same location as your storage account.
  • Runtime Stack – choose a preferred language; we will go with the default .NET for C#.
  • Storage – create a new storage account, or use an existing one, for your function app.


Azure Function

Next, configure the Application Insights resource, which defaults to the same app name. By expanding this setting, you can change the new resource name or choose a different location.

Application Insights


Now that nearly all the properties are filled in, proceed by clicking the Create button to create the Azure Function App instance. After validation, it will be deployed within a few minutes. You can view the progress notification in the Azure portal, and you will be notified once the Function App is created successfully. 

Once deployed, go and view the app.

Azure Function Created

STEP – 2: Choose a Development Environment 


Now that the Azure Function App has been created, it is time to create the desired function. Expand your function app and click the + button next to Functions, or click + New function in the Overview section.

Function Overview


Clicking the + button launches the Quickstart tab, labeled Azure Functions for .NET - getting started.

Function - Quickstart


Assuming this is the first function in your function app, scroll down a little more, select In-portal, and proceed by clicking the Continue button.

In-portal

STEP – 3: Choose Storage Blob trigger 


Once you click the Continue button, Azure shows a few default triggers and function templates. However, we need to create a Blob-based trigger, so go ahead and click More templates, then Finish and view templates.

More templates


All available templates are displayed; scroll through them and choose the Azure Blob Storage trigger, or type blob in the search field and select it.

Azure Blob Storage trigger template


Once you select it, you may be prompted with Extensions not Installed. Go ahead and install them by clicking the Install button. 

Extension Installation

Installation can take some time depending on template dependencies; once you see the Continue button, proceed by clicking it.

Extension Installation

STEP – 4: Create a Function 


In the previous steps, you launched the Azure Blob Storage trigger blade; now it is time to create a new function by providing the following property details – 
  • Name – Unique function name.
  • Path – The path within your storage account that the trigger will monitor.
  • Storage account connection - You can use the storage account connection already being used by your function app, or create a new one.
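Because in-portal functions are script-based, these choices end up in a function.json file behind the scenes; a Blob trigger binding generated this way typically resembles the following (the path and setting name reflect this walkthrough's choices and may differ in your app):

```json
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "function-data/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ],
  "disabled": false
}
```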


New Function


If you need to create a new storage account, click the new link, which launches a new Storage Account blade.

Storage Account


Here you can see a proposed account; select the one that will be synced up with this function and provide a name for the container.
  
New Function


Go ahead and click the Create button to finalize the settings. Later, in your function, click Integrate, expand Documentation, and validate the pre-populated Account name and Account key; copy them for connecting to the storage account later.

Documentation

STEP – 5: Connect to Microsoft Azure Storage Explorer 


Now that you have a Blob trigger and a Blob container, you can validate the function by uploading a file to the container. Open the Microsoft Azure Storage Explorer utility and click the connect icon on the left; the Connect to Azure Storage page launches, containing a couple of options.

Connect to Azure Storage


Now select the Use a storage account name and key option and click the Next button, which launches the Connect with Account Name and Key blade. Here you need to submit the Account name and Account key; enter the details that you copied in the previous steps during function creation.

Connect with Name and Key


Once you have provided the account details, proceed by clicking the Next button; shortly you will see the summary details.

Connection Summary


Click Connect if all the details appear correct. 

STEP – 6: Create Blob Container 


Navigate to the storage account and expand it, right-click Blob Containers, and move ahead by clicking Create Blob Container.

Create Blob Container


Next, provide the same name that you gave during function creation – function-data; this creates a Blob container with that name.

Create Function


Now the Blob container exists and you can test the function by uploading a file to it.

Container Created


STEP – 7: Validate the Function 


Navigate back to the Azure portal and expand the Logs pane at the bottom of the function page. Before proceeding, make sure log streaming is running.

Logs


Next, move to Storage Explorer, select the newly created Blob container function-data, click Upload, and upload files.

Upload files

Click the Upload button once you have selected the file through the Upload files dialog box; soon you will see the list of uploaded files.

Files Uploaded

Go back to your function logs and verify that the Blob trigger function has fired and the blob has been read.

Function Triggered

Congratulations, Blob trigger function executed successfully! 😊

Here you have gone through a hands-on activity with Azure Functions and Storage blobs by creating a function that runs when a blob is added to or updated in Blob storage. Since the Function App runs on the default Consumption plan, a delay is expected between the blob being added or updated and the function being triggered.

In the next article, we will walk through Azure Functions hands-on activities with Visual Studio. Keep visiting the blog.