The latest Azure certification exams are demanding and require plenty of demos and hands-on training. During my preparations and experiments, an e-commerce Azure SaaS platform took shape almost unintentionally.

Taking Microsoft certification exams has been a part of my professional career since 2007. Certifying myself not only verifies my current level of knowledge but also pushes me to study newly available technology. I’m currently on the DevOps journey with the following exams:

Azure certification paths to create a SaaS product
Azure certification paths in 2019

Experiments turned to a SaaS platform

There are many guides and blog posts on how to study for the exams and what materials to read. I find it best to read the Microsoft documentation, create demos and run experiments. During the study period, my demos and architectural decisions grew into a fully functioning Azure SaaS platform. I’m still on the study path, so there is room for upgrades and changes in the architectural decisions.

The demo I have created is an e-commerce solution using Azure products and services that are part of the measured skills in the exams. To improve processes and add an extra layer of intelligence to the application, Microsoft provides AI tools as part of Microsoft Cognitive Services, which I plan to include in the platform. Trialling Microsoft’s AI services will also indicate how mature they are and how ready they are for enterprise solutions.

I have kept the start-up mentality in mind by minimizing the cost of services on Azure as much as possible. The plan is also to include cost calculations for the services used in my upcoming posts. DevOps methodologies and tools will also be an active part of the process: DevOps helps to keep everything as simple as possible and to automate as many processes as possible.

The SaaS platform is currently running on https://obrame.azurewebsites.net. “Obra” means “work” in Spanish, which is the current language of the service.

Upcoming blog posts will explain the processes of the SaaS solution. The posts will also go through important Azure services and their role in the technical implementation. The next blog post will explore the high-level architecture and initial services used to run the application in the Microsoft cloud.

The process of creating Azure Functions is straightforward on the Azure Portal. The only confusing decision during function creation is which hosting model to choose. There are four different hosting plans, and the choice also determines which OS hosts your functions. In this blog post, I’ll review the different options and which one suits you best. This is what you will see on the Azure Portal when choosing your hosting plan:

Azure Functions hosting plans for each OS.

Consumption plan

The Consumption plan is available on both Windows and Linux (Linux is currently in public preview). If you are new to Azure Functions or just need a function up and running, I would recommend picking this plan, as it makes your life easier and you can get to the coding part rapidly. With this option, the platform dynamically allocates enough compute power, in other words hosts, to run your code, and scales up or down automatically as needed. You pay only for actual use, not for idle time. The bill is aggregated from all functions within an app and is based on the number of executions, execution time and memory used.
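To make the billing model concrete, here is a rough cost sketch in Python. The rates and free grants below are assumptions modeled on the publicly listed consumption-plan pricing at the time of writing, not authoritative figures; always check the Azure Functions pricing page for current numbers.

```python
# Assumed example rates (check the Azure Functions pricing page for current figures):
PRICE_PER_MILLION_EXECUTIONS = 0.20  # USD per million executions
PRICE_PER_GB_SECOND = 0.000016       # USD per GB-second of compute
FREE_EXECUTIONS = 1_000_000          # monthly free execution grant
FREE_GB_SECONDS = 400_000            # monthly free GB-second grant

def consumption_cost(executions, avg_duration_s, memory_gb):
    """Estimate the monthly consumption-plan bill for one function app."""
    gb_seconds = executions * avg_duration_s * memory_gb
    exec_cost = max(executions - FREE_EXECUTIONS, 0) / 1_000_000 * PRICE_PER_MILLION_EXECUTIONS
    compute_cost = max(gb_seconds - FREE_GB_SECONDS, 0) * PRICE_PER_GB_SECOND
    return exec_cost + compute_cost

# 3M executions per month, 0.5 s average duration, 512 MB memory:
print(f"{consumption_cost(3_000_000, 0.5, 0.5):.2f}")  # → 6.00
```

The takeaway is that short, modestly sized executions stay cheap even at millions of invocations per month, because both the execution count and the GB-seconds have a free grant before billing starts.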

App Service Plan

The App Service plan is the second choice, also available on both Windows and Linux. This plan dedicates a virtual machine to running your functions. If you have long-running, continuous, CPU- and memory-intensive workloads, this is the option to choose for the most cost-effective hosting of your functions. The plan lets you choose from the Basic, Standard, Premium and Isolated SKUs and also connect to your on-premises VNET/VPN networks to communicate with your on-site data. In this plan, the function costs no more than the VM instance that is allocated. Azure App Service plans are described in Microsoft’s official documentation.
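A rough rule of thumb for choosing between the plans is to compare the fixed VM price against the consumption-plan compute charge for your expected load. Both figures below are assumptions for illustration (an example monthly price for a small dedicated instance and an example per-GB-second rate), and the sketch ignores the per-execution charge for simplicity:

```python
APP_SERVICE_MONTHLY = 55.00   # assumed example price for a small dedicated instance, USD
GB_SECOND_PRICE = 0.000016    # assumed consumption-plan rate per GB-second, USD
FREE_GB_SECONDS = 400_000     # assumed monthly free grant on the Consumption plan

def cheaper_plan(monthly_gb_seconds):
    """Return which hosting plan is cheaper for a given monthly compute volume."""
    consumption = max(monthly_gb_seconds - FREE_GB_SECONDS, 0) * GB_SECOND_PRICE
    return "Consumption" if consumption < APP_SERVICE_MONTHLY else "App Service Plan"

# A function busy around the clock: 1.5 GB memory for ~2.6M seconds in a month.
print(cheaper_plan(2_600_000 * 1.5))  # → App Service Plan
```

With continuous load the metered charge quickly overtakes a fixed instance price, which matches the advice above: long-running, continuous workloads belong on the App Service plan.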

An excellent example of a case for the App Service plan is when the system needs to continuously crawl for certain data, either from on-premises systems or the internet, and save the information to Azure Blob Storage for further processing.

Containers
Azure Functions also supports custom Linux images and containers. I’ll dedicate a blog post to that option shortly.

Timeouts

The function app timeout for the Consumption plan is five minutes by default and can be increased to ten minutes in both runtime versions one and two. On the App Service plan, version one has an unlimited timeout by default, while version two defaults to 30 minutes, which can be raised to unlimited if needed.
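The timeout is configured with the functionTimeout setting in the function app’s host.json file. A minimal sketch for a version 2 function app on the Consumption plan, raised to its ten-minute maximum:

```json
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}
```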

After creating a function app with a particular hosting plan, you cannot change the plan afterwards; the only way is to recreate the Function App. The current hosting plan is shown on the Azure Portal under the Overview tab when you click the function name. More information about pricing can be found on the Azure Functions pricing page.

One of the most popular Azure features is Azure App Services and its Platform as a Service (PaaS) approach to architecture. It removes the overhead of setting up additional infrastructure, speeds up getting apps up and running, and is an economical solution for hosting user-facing web apps or API back ends for web and mobile apps. For the last few years, App Services has played a significant role in the architectures and services I design for customers.

As the need for background processes increased, Microsoft introduced Azure WebJobs as part of Azure Web Apps, the first step towards functional serverless architecture. A WebJob is a workflow step with a trigger based on time or on events such as Azure Storage changes, undertaking a specific logical task. WebJobs are a powerful tool for processing data and creating further actions based on business rules. The downside is that their modification, monitoring and logging features in the UI are poor and laborious compared to Azure Functions.

The release of Azure Functions was a game changer for architectural plans and for the way background processes are handled in the Microsoft cloud. Azure Functions are hosted on top of the Azure Web Apps architecture and can be triggered by HTTP requests, time schedules, or events in Azure Storage, Service Bus or Azure Event Hubs. A full introduction to Functions is available in Microsoft’s documentation.
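As a concrete example of a non-HTTP trigger, a time schedule is declared in the function’s function.json bindings file. The sketch below fires every five minutes; the binding name is illustrative:

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    }
  ]
}
```

The schedule uses a six-field CRON expression with a seconds field, which is specific to Azure Functions timer triggers.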

Function Apps can be created in a developer’s preferred programming language, such as C# or JavaScript, either from the Azure Portal using the web editor or with Visual Studio. Cross-platform developers can use Visual Studio Code in their non-Windows environments.

Azure Functions has two runtime versions, and there are significant differences between versions one and two. Version 2.x runs in a sandbox that limits access to some specific libraries in C# and .NET Core. As an example, if your function manipulates images or videos, you don’t have access to the framework’s GUI libraries and you will face exceptions. Version 1.x uses the .NET Framework 4.7 and is a powerful alternative runtime for processes where full access to the .NET Framework libraries is needed. The full list of supported languages and runtimes is available in Microsoft’s documentation.

Here is an example of the usage of Functions:
A client has financial data in different file formats and needs to process the information. The client receives most of the data as text-based PDF files. Using Functions is a perfect way to extract textual content from PDF files and create data for search and artificial intelligence. The following drawing illustrates the architecture.

  1. Azure Blob Storage to host files and PDF documents
  2. An Azure Function which is triggered as a new file is added to a container
  3. Azure Cosmos DB to save the content of the PDF file in JSON format
  4. Azure Cognitive Services to process the textual content
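Step 3 boils down to shaping the extracted text into a JSON document for Cosmos DB. A minimal Python sketch, assuming the text has already been extracted inside the blob-triggered function; the field names are illustrative, not a fixed schema:

```python
import datetime
import json

def build_document(blob_name, extracted_text):
    # Shape the extracted PDF text into the JSON document stored in Cosmos DB.
    # Cosmos DB requires an "id" field; the blob name without its extension
    # serves as a natural document id here.
    return {
        "id": blob_name.rsplit(".", 1)[0],
        "fileName": blob_name,
        "content": extracted_text,
        "processedUtc": datetime.datetime.utcnow().isoformat() + "Z",
    }

# A literal stands in for the text a PDF extraction step would produce.
doc = build_document("invoice-2019-001.pdf", "Total amount due: 1 250,00 EUR")
print(json.dumps(doc, indent=2))
```

Keeping the document flat and self-describing like this makes it straightforward to index for search or to feed into the Cognitive Services step afterwards.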

Over a decade of being part of the Microsoft ecosystem and solving customers’ needs, one of the most challenging tasks has been engaging customers and providing self-service systems to end-users. In the old days, and even today, some enterprises refer to such a system as an “Extranet”. The cloud era, and especially Microsoft Azure with its seamless integration between services, has eventually changed the world.

A few years ago, an environment where customers could authenticate, update their personal and professional information, interact with the enterprise and submit documents could cost hundreds of thousands of euros. In most cases SharePoint acted as the extranet platform, a secondary Active Directory as the identity management system and Dynamics CRM as the customer management system holding customer information, not to mention the integration platforms needed to solve communication between the systems. For risk, reliability, stability and load management, all the platforms and systems had to run in a farm and be at least duplicated for testing purposes. In most cases the capacity provisioned for these environments was frequently idle. The drawing below demonstrates a simple architecture of an on-premises server farm environment.

The minimal architecture

The costs mentioned above were only the initial hardware costs. The other project costs were the development and maintenance fees, which were much greater than the budgets planned for “Extranet” projects these days. The main reason for the bigger expense was the custom code created for the platforms. Nowadays the need for custom coding is much smaller, and most custom features are part of the platforms themselves.

Last fall Microsoft acquired a company called Adxstudio Inc., and since then I have been following Adxstudio’s main cloud-based portal product. The product is built on top of Microsoft Dynamics CRM/XRM, which nowadays lives under the Dynamics 365 brand. User authentication is handled by Azure Active Directory, but user profiles are stored in XRM as contact records that can interact with other entities in the CRM context. CRM entities and actions are extended to the web, and information gathering is made remarkably easy. The product provides content management capabilities and search to create richer information management in the portal. From a technical perspective, the UI is built with modern HTML5 and CSS3 web technologies, using the Bootstrap framework to provide responsive mobile web pages.

But what concepts and features can the Dynamics 365 Portal provide to end-users?
Here is a list of some concepts and ideas:

  • Customer service: help desk, account management and knowledge base
  • Communities and information sharing: discussion forums, idea management, polls and surveys
  • E-commerce: transactions, invoice and order management, product and quote management
  • Government: Services provided to citizens or emergency management
  • Marketing: Branding and design, conference and event management and lead generation forms

Each portal instance is hosted on Microsoft Azure and has integration support for other Microsoft cloud-based products such as SharePoint. With integrations in mind, the product fully supports a REST API, and JavaScript can be used for AJAX calls.

Currently the portal costs $500/year/instance, but with each Dynamics 365 subscription the customer gets one instance of Dynamics Portals for free!
To refresh our memory, each instance requires a Dynamics 365 CRM instance to which the portal is attached during the installation/deployment phase. Corporate staff need a CRM licence to use the portal, but there can be an unlimited number of external users for free!

In my next blog posts on Dynamics 365 Portals I’ll go through the deployment process and the features available in the product. My goal is to evaluate the product and give business and technical staff a better understanding of it.