Category Archives: Datacenter Management

The importance of a modern approach to networking and effective network governance in the cloud era

Networking is one of the pillars of the IT world: it supports the infrastructure, allows the exchange of all the data the business needs, both inside and outside the company, and enables the creation and adoption of new solutions. It is easy to see why networking is a delicate, complex and constantly evolving area. Yet what we see in many companies is a stubborn attachment to a traditional approach to networking that is by now limiting and not very effective. This article lists the main challenges of a traditional approach to networking in the modern era and offers some suggestions for adopting a different approach and for structuring effective network governance.

The challenges of traditional networking in the modern era

Looking at the main challenges that customers face every day in the networking field, we find:

  • an increase in complexity and management effort: the rapid proliferation of cloud environments, mobile devices and the IoT has effectively eroded the boundaries of modern networks, making them more difficult to manage and more vulnerable;
  • expansion of the attack surface: in this regard, the question worth being able to answer is «how can effective network protection be guaranteed without interfering with the growth and fluctuation of workloads in cloud and multi-cloud environments?»;
  • fragmented and inconsistent visibility and integration between local data centers and cloud environments: adding isolated, single-function network products to deal with communication problems only increases complexity, IT staff costs and workload;
  • changes in branch office connectivity: geographically distributed companies are replacing expensive MPLS connections with more affordable direct Internet connections, which however do not always reach the same levels of quality and performance.

All of this translates into specific critical points that I have encountered with our customers over time:

  • High costs and a lot of complexity
  • Many network solution vendors with poor integration
  • Too many alerts with slow and manual responses
  • Lack of properly trained internal IT personnel

Adopting a modern approach to networking

In light of these considerations, it becomes essential to adopt a modern approach to networking that can better address all these challenges, reducing complexity and improving efficiency. Identifying and implementing network architectures designed for digital transformation requires:

  • Security-driven networking that protects and speeds up the network and the user experience.
  • Dynamic, cross-environment management to secure and control infrastructure and applications on-premises, in hybrid environments and in public clouds.
  • Integrated solutions that connect the components of the entire network infrastructure, helping organizations adapt to a changing and increasingly challenging environment.
  • A monitored and controlled ecosystem to detect and respond to malfunctions and security threats, and to optimize operations while lightening the workload on staff.

How to structure networking governance

In the context of IT governance, network governance certainly deserves a dedicated chapter. It must encompass a set of processes through which an organization can guarantee an effective and efficient use of its IT resources in the networking field, in order to achieve its goals.

Network governance must also include the application of:

  • controls that help the company mitigate risk and create “guardrails”
  • measurements to check for potential problems

The main disciplines that emerge in network governance are:

  • Compliance and security baseline
  • Vulnerability management
  • Identity management and access control
  • Acceleration, control and consistency in the deployment and change processes of network solutions
  • Optimization and efficiency of wired and wireless networks

Importantly, all of this must cover the networking resources in scope in any environment, both on-premises and in cloud and multi-cloud realities, with a structured, consolidated and holistic approach.
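As a minimal illustration of what an automated "measurement" in network governance can look like, the following Python sketch uses the Azure SDK to scan the network security groups of a subscription and flag inbound rules that are open to any source. It assumes the azure-identity and azure-mgmt-network packages and a subscription ID of your own; treat it as a starting point, not a complete compliance baseline.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

# Assumption: replace with your own subscription ID.
SUBSCRIPTION_ID = "<subscription-id>"

credential = DefaultAzureCredential()
network_client = NetworkManagementClient(credential, SUBSCRIPTION_ID)

# Scan every NSG in the subscription and report inbound "Allow" rules that are
# open to any source: a simple, repeatable measurement that can feed a
# network governance report.
for nsg in network_client.network_security_groups.list_all():
    for rule in nsg.security_rules or []:
        if (
            rule.direction == "Inbound"
            and rule.access == "Allow"
            and rule.source_address_prefix in ("*", "Internet", "0.0.0.0/0")
        ):
            print(f"{nsg.name}: rule '{rule.name}' allows inbound traffic "
                  f"from {rule.source_address_prefix} on ports {rule.destination_port_range}")
```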

In this area too, Microsoft offers various tools and solutions for facing the challenge of network governance in the Azure environment; these must be paired with experience in order to implement consolidated and reliable processes.

Conclusions

In recent years, the adoption of hybrid architectures has drawn considerable interest from many companies, attracted by the possibilities and benefits they offer. To build these environments properly and promote innovation, it is also essential to adapt the approach to the use of network resources and extend the governance processes of the IT environment to the networking area.

Azure IaaS and Azure Stack: announcements and updates (February 2023 – Weeks: 05 and 06)

This series of blog posts includes the most important announcements and major updates regarding Azure infrastructure as a service (IaaS) and Azure Stack, officially announced by Microsoft in the last two weeks.

Azure

Compute

New planned datacenter region in Saudi Arabia (Saudi Arabia Central)

Microsoft will establish a new datacenter region in the country, offering organizations in Saudi Arabia local data residency and faster access to the cloud, delivering advanced data security and cloud solutions. The new datacenter region will also include Availability Zones, providing customers with high availability and additional tolerance to datacenter failures.

Azure Kubernetes Service introduces two pricing tiers: Free and Standard

To better communicate the benefits and use cases of the two control plane management options, Azure Kubernetes Service (AKS) is today introducing two pricing tiers: Free tier and Standard tier. Previously, few customers were aware of the uptime SLA support, and many did not have the uptime SLA feature enabled for critical production workloads. With the Standard tier, Microsoft hopes to increase customer awareness and allow customers to gain the full benefits of the Standard tier for production workloads, minimizing disruption.

AKS’s Free tier lets you pay only for the virtual machines and the associated storage and networking resources consumed, while the managed Kubernetes control plane is free. This allows you to deploy unlimited free test clusters to decide whether AKS is right for your needs, and to configure and test your infrastructure set-up before running critical production workloads. The Free tier is recommended for clusters with fewer than 10 nodes and for experimenting, learning and simple testing.

The new Standard tier is the recommended control plane management pricing option which comes with greater control plane resources, scalability and the existing uptime SLA support. Customers currently signed up for the uptime SLA support will automatically be moved to the Standard tier with no change in cost or action needed. Standard tier not only includes the uptime SLA, but it will also include additional features such as support for up to 5000 nodes per cluster and API server autoscaling.
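As a quick way to check which tier your existing clusters use, the following sketch relies on the Python SDK for AKS (the azure-mgmt-containerservice and azure-identity packages are assumed) to list the clusters in a subscription and print their control plane SKU tier. The exact tier values returned can vary with the API version, so treat it as an indicative check rather than a definitive report.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.containerservice import ContainerServiceClient

# Assumption: replace with your own subscription ID.
SUBSCRIPTION_ID = "<subscription-id>"

credential = DefaultAzureCredential()
aks_client = ContainerServiceClient(credential, SUBSCRIPTION_ID)

# List every AKS cluster in the subscription and show its control plane SKU,
# which reflects the Free/Standard pricing tier discussed above.
for cluster in aks_client.managed_clusters.list():
    tier = cluster.sku.tier if cluster.sku else "unknown"
    print(f"{cluster.name} ({cluster.location}): tier = {tier}")
```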

Microsoft Azure Load Testing is now Generally Available

Azure Load Testing is a fully managed load-testing service that enables you to generate high-scale load, gain actionable insights, and ensure the resiliency of your applications and services. The service simulates traffic for your applications, regardless of where they’re hosted. Developers, testers, and quality assurance (QA) engineers can use it to optimize application performance, scalability, or capacity.

Trusted launch for Azure VMs in Azure for US Government regions

Trusted launch for Azure virtual machines is available in all Azure for US Government regions: US Gov Virginia, US Gov Arizona, US Gov Texas, US DoD East and US DoD Central. Trusted launch allows you to bolster the security posture of an Azure virtual machine.

Storage

Azure File Sync agent v16

The Azure File Sync agent v16 release is being flighted to servers which are configured to automatically update when a new version becomes available.

Improvements and issues that are fixed:

  • Improved Azure File Sync service availability: Azure File Sync is now a zone-redundant service which means an outage in a zone has limited impact while improving the service resiliency to minimize customer impact. To fully leverage this improvement, configure your storage accounts to use zone-redundant storage (ZRS) or Geo-zone redundant storage (GZRS) replication.
  • Sync upload performance improvements: this improvement will mainly benefit file share migrations (initial upload) and high churn events on the server in which a large number of files need to be uploaded.
  • Immediately run server change enumeration to detect file changes that were missed on the server.
  • Miscellaneous reliability and telemetry improvements for cloud tiering and sync.

To obtain and install this update, configure your Azure File Sync agent to automatically update when a new version becomes available or manually download the update from the Microsoft Update Catalog.

More information about this release:

  • This release is available for Windows Server 2012 R2, Windows Server 2016, Windows Server 2019 and Windows Server 2022 installations.
  • The agent version for this release is 16.0.0.0.
  • Installation instructions are documented in KB5013877.

Azure storage access tiers to append blobs and page blobs with blob type conversion

Azure Storage offers different access tiers so that you can store your blob data in the most cost-effective manner based on how it is used. Azure Storage access tiers include the hot, cool and archive tiers. Access tiers natively support only block blobs. When you need to reduce the cost of storing append blobs or page blobs, you can convert them to block blobs and then move them into the most cost-efficient tiers based on your access patterns. Blob type conversion along with tiering is now supported by PowerShell, CLI and AzCopy.
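The announcement covers PowerShell, CLI and AzCopy; purely as an illustration of the same idea, the Python sketch below converts a small append blob into a block blob and assigns it a target tier by reading the content and re-uploading it (the account, container and blob names are placeholders). For large blobs, the dedicated tools mentioned above remain the better fit.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Assumptions: the storage account URL, container and blob names are placeholders.
ACCOUNT_URL = "https://<storage-account>.blob.core.windows.net"
CONTAINER = "<container>"
SOURCE_APPEND_BLOB = "<append-blob-name>"

service = BlobServiceClient(account_url=ACCOUNT_URL, credential=DefaultAzureCredential())
container = service.get_container_client(CONTAINER)

# Read the append blob and re-upload its content as a block blob in the Cool tier:
# the new blob type makes the access tiers described above applicable.
source = container.get_blob_client(SOURCE_APPEND_BLOB)
data = source.download_blob().readall()

destination = container.get_blob_client(SOURCE_APPEND_BLOB + ".block")
destination.upload_blob(data, blob_type="BlockBlob", standard_blob_tier="Cool")
print(f"Converted {SOURCE_APPEND_BLOB} to block blob {destination.blob_name} in the Cool tier")
```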

How to prepare your IT environment for new hybrid and multicloud scenarios

Many companies are working on the rollout and adoption of applications that can run in different environments: on-premises, across multiple public clouds and at the edge. Such an approach requires adequate preparation of the corporate IT environment to ensure compliance and efficient management of large-scale server systems, applications and data, while maintaining high agility. This article introduces the main aspects to consider when adopting hybrid and multicloud technologies, in order to best meet business needs.

The reasons that lead to the adoption of hybrid and multicloud solutions

There are many reasons why customers choose to deploy their digital assets in hybrid and multicloud environments. Among the main ones we find:

  • Minimize or remove data lock-in from a single cloud provider
  • Presence of business units, subsidiary companies or acquired companies that have already made choices to adopt different cloud platforms
  • Different regulatory and data sovereignty requirements in different countries
  • Need to improve business continuity and disaster recovery by distributing workloads between two different cloud providers
  • Need to maximize performance by allowing applications to run close to where users are

What aspects to consider?

There are several options for preparing an IT environment to host hybrid and multicloud deployments. For this reason, before setting up your Azure environment, or any other public cloud, it is important to identify how the cloud environment should support your scenario:

Figure 1 – Diagram showing how different customers distribute workloads between cloud providers

In the image above, each dark blue point represents a workload and each blue circle a business process, supported by a separate environment. Depending on the cloud mix, a different configuration of the Azure environment may be required:

  • Hybrid-first customer: most workloads remain on-premises, often in a combination of hosting models with traditional and hybrid resources. Some specific workloads are deployed at the edge, in Azure or with other cloud service providers.
  • Azure-first customer: most workloads reside in Azure, while some remain on-premises. In addition, certain strategic decisions lead some workloads to reside at the edge or in multicloud environments.
  • Multicloud-first customer: most workloads are hosted on a public cloud other than Azure, such as Amazon Web Services (AWS) or Google Cloud Platform (GCP). However, some strategic decisions have led some workloads to be placed in Azure or at the edge.

The hybrid and multicloud strategy you decide to pursue for applications and data will have to guide certain choices.

How to prepare the Azure environment

Microsoft Azure is an enterprise-grade cloud platform, well placed to support public, hybrid and multicloud environments.

To prepare an IT environment and make it effective for any hybrid and multicloud deployment, the following key aspects should be considered:

  • Network topology and connectivity
  • Governance
  • Security and compliance
  • Automation disciplines, development experiences and DevOps practices

When preparing your IT environment for new hybrid and multicloud scenarios, it is advisable to define the Azure "Landing Zone", which represents the point of arrival in the cloud adoption journey. It is an architecture designed to let you manage functional cloud environments, covering the following aspects:

  • Scalability
  • Security governance
  • Networking
  • Identity
  • Cost management
  • Monitoring

The architecture of the Landing Zone must be defined based on specific business and technical requirements. It is therefore necessary to evaluate the possible Landing Zone implementation options, through which the deployment and operational needs of the cloud portfolio can be met.

Figure 2 – Conceptual example of an Azure landing zone

What tools to use?

Cloud Adoption Framework

Microsoft's Cloud Adoption Framework provides a rich set of documentation, implementation guidelines, best practices and helpful tools to accelerate your cloud adoption journey. Among these best practices, which should be adopted and tailored to each customer's specific needs, there is a dedicated section on hybrid and multicloud environments. It covers the practices that can support various cloud mixes, ranging from environments entirely in Azure to environments where the Microsoft public cloud footprint is limited or absent.

Azure Arc as an accelerator

Azure Arc consists of a set of technologies and components that give you a single control plane to manage and govern all your IT resources in a consistent way, wherever they are. Furthermore, with Azure Arc-enabled services, you have the flexibility to deploy fully managed Azure services anywhere, on-premises or in other public clouds.

Figure 3 –  Azure Arc overview

The Azure Arc-enabled servers Landing Zone, included in the Cloud Adoption Framework, makes it easier for customers to improve the security, governance and compliance posture of servers deployed outside of Azure. Together with Azure Arc, services like Microsoft Defender for Cloud, Microsoft Sentinel, Azure Monitor, Azure Policy and many others can be extended to all environments. For this reason, Azure Arc should be considered an accelerator for your Landing Zones.
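To get a feel for what a single control plane means in practice, the following Python sketch (the azure-mgmt-hybridcompute and azure-identity packages and the subscription ID are assumptions) lists the Azure Arc-enabled servers in a subscription together with their connection status, regardless of whether the underlying machines run on-premises or in another cloud.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.hybridcompute import HybridComputeManagementClient

# Assumption: replace with your own subscription ID.
SUBSCRIPTION_ID = "<subscription-id>"

credential = DefaultAzureCredential()
arc_client = HybridComputeManagementClient(credential, SUBSCRIPTION_ID)

# Arc-enabled servers show up as regular ARM resources, so they can be inventoried,
# tagged and governed with the same tooling used for native Azure VMs.
for machine in arc_client.machines.list_by_subscription():
    print(f"{machine.name} ({machine.location}): status = {machine.status}")
```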

Azure Arc Jumpstart has grown a lot and lets you evaluate Azure Arc more effectively, with over 90 automated scenarios, thousands of visitors per month and a very active open source community sharing its knowledge about Azure Arc. As part of Jumpstart, ArcBox was developed: an automated sandbox environment for everything related to Azure Arc, deployable to customers' Azure subscriptions. As an accelerator for the Azure Arc-enabled servers landing zone, ArcBox for IT Pros has been developed, which serves as a sandbox automation solution for this scenario, with services like Azure Policy, Azure Monitor, Microsoft Defender for Cloud, Microsoft Sentinel and more.

Figure 4 – Architecture of ArcBox for IT Pros

Conclusions

The adoption of consistent operating practices across all cloud environments, combined with a common control plane, allows you to effectively address the challenges inherent in hybrid and multicloud strategies. To do this, Microsoft provides various tools and accelerators, among which Azure Arc makes it easier for customers to improve the security, governance and compliance posture of IT resources deployed outside of Azure.

Azure Management services: what's new in January 2023

The new year started with several announcements from Microsoft regarding Azure management services. This summary, released on a monthly basis, gives you an overall view of the month's main news, so you can stay up to date on these topics and have the references needed for further exploration.

The following diagram shows the different areas related to management, which are covered in this series of articles:

Figure 1 – Management services in Azure overview

Monitor

Azure Monitor

IT Service Management Connector (ITSMC) certified with the ServiceNow Tokyo version (preview)

The IT Service Management Connector (ITSMC) is certified on the Tokyo version of ServiceNow. This connector provides a two-way connection between Azure Monitor and ServiceNow, useful to help you track and fix problems faster.

Govern

Azure Cost Management

Management of billing accounts for EA customers

For “indirect” Enterprise Agreement (EA) customers, the ability to manage billing accounts directly from Cost Management and Billing has been introduced. All relevant department, account and subscription information is available directly from the Azure portal. Furthermore, from the same place it is possible to view the properties and manage the policies of indirect EA enrollments.

Updates related to Microsoft Cost Management

Microsoft is constantly looking for new ways to improve Microsoft Cost Management, the solution that provides greater visibility into where costs are accumulating in the cloud, identifies and prevents incorrect spending patterns and helps optimize costs. In this article, some of the latest improvements and updates regarding this solution are reported.

Azure Arc

Active Directory Connector for Arc-enabled SQL MI

Azure Arc-enabled data services introduced Active Directory (AD) support for identity and access management (IAM). The Arc-enabled SQL Managed Instance can use an existing on-premises Active Directory (AD) domain for authentication. To facilitate this, Azure Arc-enabled data services introduce a new Kubernetes-native Custom Resource Definition (CRD) called Active Directory Connector. It gives Azure Arc-enabled SQL Managed Instances running on the same data controller the ability to perform Active Directory authentication.

View SQL Server databases using Azure Arc (preview)

Today, customers and partners manage a large number of databases. For each of these databases, it is essential to be able to create an accurate mapping of the configurations, whether for inventory or reporting purposes. Centralizing database inventory in Azure using Azure Arc allows you to create a unified view of all your databases in one place, regardless of the infrastructure on which they run: in Azure, in the data center, at edge sites or even in other clouds.

Secure

Microsoft Defender for Cloud

New features, bug fixes and deprecated features of Microsoft Defender for Cloud

Microsoft Defender for Cloud is constantly evolving and improvements are being made on an ongoing basis. To stay up to date on the latest developments, Microsoft updates this page, which provides information about new features, bug fixes and deprecated features. In particular, this month's main news concerns:

  • the endpoint protection component (Microsoft Defender for Endpoint) is now accessible on the Settings and monitoring page;
  • new version of the recommendation to find missing system updates;
  • cleanup of deleted Azure Arc machines in linked AWS and GCP accounts.

Protect

Azure Backup

Updates and improvements regarding SAP HANA

The following updates and improvements have been made recently to Azure Backup for SAP HANA, the Backint-certified solution for protecting SAP HANA databases residing in Azure virtual machines:

  • Long-term retention for “adhoc” backups: it is now possible to set customized retention for on-demand backups taken outside the scheduled policies.
  • Partial restore-as-files: Azure Backup for HANA allows recovery points to be restored as files. If you have downloaded the entire chain for one recovery point and want to repeat the operation for an adjacent recovery point, you do not need to download the entire chain again. It is also possible to restore only the files you want.
  • Integration with native clients and other tools: previously, for certain scenarios, it was necessary to deactivate Backint before the request and reactivate it afterwards, thereby increasing the RPO. With the improvements introduced, these additional steps are no longer necessary and it is sufficient to trigger the requests from the native clients or from the other tools used.

Azure Site Recovery

Ability to use Azure Backup Center to monitor ASR

Azure Backup Center is the point of reference for those who use the native backup features of the Azure platform, allowing them to govern, monitor, manage and analyze backup tasks. Microsoft has extended its capabilities by adding monitoring for Azure Site Recovery, which enables:

  • Viewing the inventory of replicated items, from a single view, across all vaults.
  • Consulting all replication jobs through a single dashboard.

Azure Backup Center supports ASR replication scenarios involving Azure virtual machines, VMware and physical machines.

Migrate

Azure Migrate

New Azure Migrate releases and features

Azure Migrate is the Azure service that includes a large portfolio of tools you can use, through a guided experience, to effectively address the most common migration scenarios. To stay up to date on the latest developments in the solution, please consult this page, which provides information about new releases and features. In particular, this month's main news concerns:

  • Ability to plan savings with the ASP (Azure Savings Plan for compute) option in the Azure Migrate business case and assessment.
  • Support for exporting the business case report to an .xlsx workbook from the portal.

Evaluation of Azure

To test for free and evaluate the services provided by Azure you can access this page.

Azure IaaS and Azure Stack: announcements and updates (January 2023 – Weeks: 03 and 04)

This series of blog posts includes the most important announcements and major updates regarding Azure infrastructure as a service (IaaS) and Azure Stack, officially announced by Microsoft in the last two weeks.

Azure

Compute

Classic VM retirement: extending retirement date to September 1st 2023

Microsoft is providing an extended migration period for IaaS VMs from Azure Service Manager to Azure Resource Manager. To avoid service disruption, plan and migrate IaaS VMs from Azure Service Manager to Resource Manager by 1 September 2023. There are multiple steps to this transition, so we recommend that you plan your migration promptly to avoid potential system interruption.

Networking

Application security groups support for private endpoints

Private endpoint support for application security groups (ASGs) is now available. This feature enhancement will allow you to add granular controls on top of existing network security group (NSG) rules by attaching an ASG to the private endpoint network interface. This will increase segregation within your subnets without losing security rules. In order to leverage this feature, you will need to set a specific subnet level property, called PrivateEndpointNetworkPolicies, to enabled on the subnet containing private endpoint resources.
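A hedged sketch of that subnet-level change using the Python SDK is shown below (the azure-mgmt-network package is assumed and all resource names are placeholders); the same setting can of course be applied through the portal, CLI, PowerShell or ARM/Bicep templates.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

# Assumptions: subscription, resource group, VNet and subnet names are placeholders.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
VNET_NAME = "<vnet-name>"
SUBNET_NAME = "<subnet-with-private-endpoints>"

credential = DefaultAzureCredential()
network_client = NetworkManagementClient(credential, SUBSCRIPTION_ID)

# Read the subnet, enable network policies for private endpoints (so NSG/ASG
# rules apply to private endpoint NICs), and write the change back.
subnet = network_client.subnets.get(RESOURCE_GROUP, VNET_NAME, SUBNET_NAME)
subnet.private_endpoint_network_policies = "Enabled"
poller = network_client.subnets.begin_create_or_update(
    RESOURCE_GROUP, VNET_NAME, SUBNET_NAME, subnet
)
poller.result()
print("PrivateEndpointNetworkPolicies set to Enabled")
```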

Storage

5 GB Put Blob

Azure Storage is announcing the general availability of 5 GB Put Blob. This allows you to upload nearly 20x the previous limit of Put Blob uploads while increasing the maximum size of Put Blob from 256 MiB to 5000 MiB.

Mount Azure Storage as a local share in App Service Windows Code

Mounting Azure Storage File share as a network share in Windows code (non-container) in App Service is now available.

Incremental snapshots for Ultra Disk Storage (preview)

The preview of incremental snapshots for Ultra Disks is available in the Sweden Central and US West 3 Azure regions. This new capability is particularly important for customers who want to create a backup copy of the data stored on their disks to recover from accidental deletions, to have a last line of defense against ransomware attacks, or to ensure business continuity. You can now create incremental snapshots for Ultra Disks, which are stored on Standard HDD storage. Additionally, snapshot resources can be used to store incremental backups of your disk, create or recover to new disks, or download snapshots to on-premises locations.
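As an indicative example, the following Python sketch creates an incremental snapshot of an Ultra Disk with the azure-mgmt-compute package (the subscription, resource group, disk ID and region are placeholders, and the feature is in preview only in the regions listed above):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

# Assumptions: subscription, resource group, snapshot name and source disk ID are placeholders.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
SNAPSHOT_NAME = "ultra-disk-snap-001"
SOURCE_DISK_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Compute/disks/<ultra-disk-name>"
)

credential = DefaultAzureCredential()
compute_client = ComputeManagementClient(credential, SUBSCRIPTION_ID)

# Create an incremental snapshot: only the delta since the previous snapshot of
# the same disk is stored, which keeps backup costs down.
poller = compute_client.snapshots.begin_create_or_update(
    RESOURCE_GROUP,
    SNAPSHOT_NAME,
    {
        "location": "swedencentral",  # one of the preview regions mentioned above
        "incremental": True,
        "creation_data": {
            "create_option": "Copy",
            "source_resource_id": SOURCE_DISK_ID,
        },
    },
)
snapshot = poller.result()
print(f"Created snapshot {snapshot.name} (incremental={snapshot.incremental})")
```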

Azure Stack

Azure Stack HCI

Software Defined Networking (SDN) with WAC v2211

This article covers all the new features and improvements for SDN in Windows Admin Center 2211 (WAC) for Azure Stack HCI.

The calculation of the energy consumption and environmental impact of Microsoft's public cloud

After the Paris Agreement, with increased attention on climate change and measures taken by governments to reduce carbon emissions, the environmental impact of IT systems is increasingly in the spotlight. Several studies have shown that the cloud also offers significant benefits in terms of sustainability and provides companies with the possibility of reducing the environmental impact of IT services, thus contributing to a more sustainable future. To evaluate the real impact, it is advisable to apply measurements and controls. This article describes the methodology designed to calculate the carbon emissions associated with the use of Microsoft Azure resources.

Microsoft provides tools to monitor and manage the environmental impact of carbon emissions, based on the methodology described in this article, which is constantly evolving and improving. These tools, specific to the Azure cloud, allow you to:

  • Get the visibility you need to promote sustainability, taking into account both emissions and carbon use.
  • Simplify data collection and emissions calculations.
  • Analyze and report more efficiently the environmental impact and progress of a company in terms of sustainability.

This methodology used by Microsoft is constantly updated to include science-based approaches as they become available and relevant for assessing the carbon emissions associated with the Azure cloud.

Standards used for calculation

Microsoft divides its greenhouse gas (GHG) emissions into three categories (scopes), following the Greenhouse Gas Protocol, a globally recognized standard for calculating and reporting greenhouse gas emissions.

Scope 1: direct emissions – emissions deriving from combustion and industrial processes

Greenhouse gas emissions in this category include emissions from the combustion of diesel and emissions from the use of refrigerants for cooling data centers.

Scope 2: indirect emissions – emissions resulting from electricity consumption, heat or steam

Greenhouse gas emissions in this category include emissions from the consumption of electricity used to power Microsoft data centers.

Scope 3: other indirect emissions – the emissions generated during the production phase and at the end of the product life cycle

Greenhouse gas emissions in this category include emissions from the extraction of raw materials, from component assembly and from the end-of-life management (for example recycling, landfill or composting) of hardware devices, such as servers and network equipment, used in Microsoft data centers.

Figure 1 – Examples of Scope 1, 2 and 3 carbon emissions in the Microsoft cloud

In this context, it should be borne in mind that in 2020 Microsoft reaffirmed its commitment to integrating sustainability into all of its businesses, announcing an ambitious goal and plan to reduce and ultimately eliminate carbon emissions. Under this plan, Microsoft has set itself the goal of becoming carbon negative by 2030, and is adopting various strategies to reduce its carbon emissions, including purchasing renewable energy, optimizing the energy efficiency of its data centers and supporting the transition to a low-carbon economy.

Reference standards

Microsoft also bases its calculation methodology on ISO standards widely accepted in the industry:

  • Carbon emissions related to materials are based on ISO standard 14067:2018 (Greenhouse gases – Product carbon footprint – Quantification requirements and guidelines).
  • Operational emissions are based on ISO standard 14064-1:2006 (Greenhouse gases – Part 1: Organization-wide specifications and guidelines for quantifying and reporting GHG emissions and removals).
  • Verification and validation are based on the ISO standard 14064-3:2006 (Greenhouse gases – Part 3: Specifications with guidance on validating and verifying greenhouse gas claims).

Calculation methodologies

Scope 1 and 2

Greenhouse gas emissions related to the use of electricity for Scopes 1 and 2 are usually divided into categories such as Storage, Compute and Network. The quantification of the emissions for these scopes is based on the usage time of the individual categories. The methodology used to calculate Scope 1 and 2 emissions is based on a lifecycle analysis presented in a Microsoft study, available at this address. For Scope 2, the methodology calculates the energy impact and carbon emissions of each specific data center, considering factors such as data center and server efficiency, emission factors, renewable energy purchases and infrastructure energy usage over time.

Scope 3

The calculation of emissions relating to Scope 3 is summarized in the following figure:

Figure 2 – Methodology for calculating emissions relating to Scope 3

The process starts with a life-cycle assessment of the materials used in the data center infrastructure, from which the related carbon emissions are calculated. This total is then apportioned based on each customer's usage of each data center, as sketched below.
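Expressed schematically (a simplified restatement of the apportionment step described above, not Microsoft's exact formula), the Scope 3 emissions attributed to a customer can be thought of as:

  E_customer ≈ E_datacenter(Scope 3) × ( U_customer / U_total )

where E_datacenter(Scope 3) is the life-cycle emissions total calculated for the data center and U_customer / U_total is the customer's share of that data center's usage.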

This methodology for Scope 3 emissions calculates the energy and carbon footprint of each data center over time, taking the following into consideration:

  • The most common materials used for the construction of the IT infrastructure used in data centers.
  • The main components that make up the cloud infrastructure.
  • The complete list of all assets in Microsoft data centers.
  • Carbon factors for cloud infrastructure at all stages of the lifecycle (extraction of raw materials, component assembly, use and disposal at the end of the life cycle).

Validation of the Microsoft methodology for scope 3 is published at this link.

Common definitions

This section contains definitions of the most frequently used terms relating to the impact of emissions:

  • mtCO2e: metric tons of CO2 equivalent, the unit of measurement used to express the impact of greenhouse gas emissions on the global greenhouse effect. It takes into account not only carbon dioxide (CO2) emissions, but also other greenhouse gases such as methane (CH4), nitrous oxide (N2O) and fluorinated gases (F-gases). mtCO2e is used to measure global greenhouse gas emissions and to set emission reduction targets.
  • Carbon emissions (mtCO2e) from Azure: the amount of greenhouse gases, mainly carbon dioxide (CO2), emitted into the atmosphere due to the use of Microsoft Azure cloud computing services. This value includes Microsoft's Scopes 1, 2 and 3.
  • Carbon intensity (mtCO2e/usage): the carbon intensity index provides a ratio between carbon dioxide equivalent emissions and another variable. For Green SKU, this is the total carbon dioxide equivalent emissions per hour of use, measured in mtCO2e/hour. The purpose of this index is to provide visibility into the carbon emissions related to the use of Azure services.
  • Carbon emissions expected at the end of the year (mtCO2e): Projected end-of-year cumulative carbon emissions allocation based on current year's cloud resource usage projection and previous year's trends.
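To make the carbon intensity and end-of-year projection definitions concrete, here is a tiny worked example in Python; the figures are invented purely for illustration:

```python
# Hypothetical figures, purely illustrative: 1.2 mtCO2e emitted over 40,000 hours
# of Azure resource usage in a reporting period.
emissions_mtco2e = 1.2
usage_hours = 40_000

# Carbon intensity as defined above: emissions per hour of use (mtCO2e/hour).
carbon_intensity = emissions_mtco2e / usage_hours
print(f"Carbon intensity: {carbon_intensity:.8f} mtCO2e/hour")  # 0.00003000

# A naive end-of-year projection: scale the intensity to the projected annual usage.
projected_usage_hours = 100_000
projected_emissions = carbon_intensity * projected_usage_hours
print(f"Projected emissions: {projected_emissions:.2f} mtCO2e")  # 3.00
```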

Conclusions

To identify the benefits to the IT environment of deploying applications on Azure, it is important to educate customers about the environmental impact of their IT assets and provide them with the tools to govern that impact. This must be done with the intention of improving, setting specific and realistic sustainability objectives. Such an approach benefits both the business and society.

Azure IaaS and Azure Stack: announcements and updates (January 2023 – Weeks: 01 and 02)

This series of blog posts includes the most important announcements and major updates regarding Azure infrastructure as a service (IaaS) and Azure Stack, officially announced by Microsoft in the last two weeks.

Azure

Storage

Azure Ultra Disk Storage in Switzerland North and Korea South

Azure Ultra Disk Storage is now available in one zone in Switzerland North and with Regional VMs in Korea South. Azure Ultra Disk Storage offers high throughput, high IOPS, and consistent low latency disk storage for Azure Virtual Machines (VMs). Ultra Disk Storage is well-suited for data-intensive workloads such as SAP HANA, top-tier databases, and transaction-heavy workloads.

Azure Active Directory authentication for exporting and importing Managed Disks

Azure already supports locking disk import and export to a trusted Azure Virtual Network (VNet) only, using Azure Private Link. For greater security, integration with Azure Active Directory (AD) for exporting and importing data to Azure Managed Disks is now available. This feature enables the system to validate the identity of the requesting user in Azure AD and verify that the user has the required permissions to export and import that disk.

Migrating to Azure: from motivations to a successful business case

Moving to the cloud can certainly lead to cost savings, more effective use of resources and improved performance. However, the question to ask before tackling a migration path is: “why move to the cloud?”. The answer is not trivial and often coincides with: “our board of directors (or the CIO) told us to move to the cloud”. An answer of this kind should set off an alarm bell, as it could make it harder to achieve the expected results. This article discusses some of the motivations behind migrating to the cloud that can help drive more successful business outcomes, and which elements and tools to consider to support building a complete business case.

Motivations for moving to the cloud

There are many different motivations that can drive the business transformations supported by cloud adoption. To help generate ideas about which motivations may be relevant, the following table groups the main ones into three classifications:

Critical business events:

  • Data center exit
  • Mergers, acquisitions or divestments
  • Reduction of capital expenses
  • End of support for mission critical technologies
  • Respond to regulatory compliance changes
  • New data sovereignty requirements
  • Reduce outages and improve the stability of your IT environment
  • Report and manage the environmental impact of your business

Migration:

  • Cost savings
  • Reduction of technical or vendor complexity
  • Optimization of internal operations
  • Increase business agility
  • Preparation for new technical capabilities
  • Scalability to meet market demands
  • Scalability to meet geographic needs
  • Integration of a complex IT portfolio

Innovation:

  • Preparation for new technical capabilities
  • Creation of new technical capabilities
  • Scalability to meet market demands
  • Scalability to meet geographic needs
  • Improved customer experience and engagement
  • Processing of products or services
  • Market disruption with new products or services
  • Democratization and/or self-service environments

Table 1 – Top reasons for adopting the cloud

It is likely that several motivations for cloud adoption will apply at the same time and fall into different classifications.

To guide the development of your cloud adoption strategy, it is recommended to identify the predominant classification among critical business events, migration and innovation. These motivations must then be shared and discussed with stakeholders, corporate executives and leaders. This makes it possible to foster the successful adoption of cloud solutions within the company.

How to accelerate migration

Migration is often the first step toward the adoption of cloud solutions. In this regard, it is useful to follow the "Migrate" methodology defined in the Cloud Adoption Framework, which outlines the strategy for performing a cloud migration.

This guide, after aligning stakeholders on motivations and expected business outcomes, advises clients to establish the right partnerships to get the necessary support throughout the entire migration process.

The next step involves data collection and an analysis of the assets and workloads to be migrated. This step must lead to the development of a business case for the cloud migration, with the aim of ensuring that all stakeholders are aligned on a simple question: “based on the available data, is cloud adoption a wise business decision?”.

If so, you can continue with the next steps detailed in the guide, which include:

  • Creating a migration plan
  • The preparation of the necessary skills
  • The activation and configuration of the Landing Zone
  • The migration of the first workloads to Azure
  • The implementation of cloud governance and operations management

Creating Business Cases: key elements, tools and calculators

A business case provides an overall view of the technical and financial picture of the analyzed environment over time. Developing a business case must necessarily include the creation of a financial plan that takes the technical aspects into account and is in line with business results.

There are several key components to consider when making a business case, among these we find:

  • Scope of the environment
  • Basic financial data
  • On-premises cost scenario: a forecast of what on-premises costs will be if you do not migrate to the cloud.
  • Azure cost scenario: cost forecast in case of cloud migration.
  • Migration Timeline

A business case is not just a snapshot in time; it must be a plan covering a defined period. As a last step, it is useful to compare the cloud environment with an on-premises scenario or with the status quo, so you can evaluate, on the basis of data, the benefits of migrating to the cloud.
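As a minimal sketch of this comparison over a defined time period (all figures are hypothetical and deliberately simplistic; a real business case should rely on the tools described in the following paragraphs):

```python
# Hypothetical yearly costs over a 3-year business case horizon.
# On-premises: steady run costs plus a hardware refresh in year 2.
on_prem_costs = [400_000, 550_000, 400_000]

# Azure: one-off migration effort in year 1, then optimized run costs.
azure_costs = [480_000, 350_000, 330_000]

cumulative_on_prem = sum(on_prem_costs)
cumulative_azure = sum(azure_costs)

print(f"3-year on-premises cost: {cumulative_on_prem:,} EUR")   # 1,350,000 EUR
print(f"3-year Azure cost:       {cumulative_azure:,} EUR")     # 1,160,000 EUR
print(f"Projected saving:        {cumulative_on_prem - cumulative_azure:,} EUR")  # 190,000 EUR
```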

To support the preparation of a business case for cloud migration you need to use tools and calculators. Microsoft provides several, described in the following paragraphs.

Azure Migrate

Azure Migrate is the Azure service that includes a large portfolio of tools you can use, through a guided experience, to effectively address the most common migration scenarios.

Azure Migrate recently introduced the business case feature, which helps build propositions to understand how Azure can drive the most value. This solution allows you to evaluate the return on investment of migrating server systems, SQL Server deployments and ASP.NET web applications running in a VMware environment to Azure. The business case can be easily created and provides useful elements for evaluating:

  • Your on-premises total cost of ownership compared to Azure.
  • Information based on resource usage, to identify ideal servers and workloads for the cloud and recommendations for right sizing in Azure.
  • The benefits for migration and modernization, including the end of support for Windows and SQL versions.
  • The long-term savings of moving from a capital expenditure model to an operating expenditure model, paying only for what you use.

Azure Total Cost of Ownership (TCO) Calculator

The Azure TCO calculator can be used to estimate the cost savings that can be achieved by migrating workloads to Azure. By entering the details of the on-premises infrastructure (servers, databases, storage and networking, as well as licensing assumptions and costs), the calculator matches the corresponding Azure services and shows a high-level TCO comparison. However, the results of the Azure TCO calculator should be read carefully: once on Azure, further optimization measures can be taken, so the comparison may not be exhaustive.

Azure Pricing Calculator

The Azure Pricing Calculator can be used to estimate monthly costs for Azure solutions.

Azure VM cost estimator

This is a Power BI model that helps you estimate the Azure savings achievable, compared to the pay-as-you-go rate, by adopting Azure offers and benefits for virtual machines, such as Azure Hybrid Benefit and reserved instances.

Conclusions

Identifying the motivations, conducting an assessment and building a business case are essential elements to build a functional cloud adoption strategy and to adopt a successful migration plan.

Azure IaaS and Azure Stack: announcements and updates (December 2022 – Weeks: 51 and 52)

This series of blog posts includes the most important announcements and major updates regarding Azure infrastructure as a service (IaaS) and Azure Stack, officially announced by Microsoft in the last two weeks.

During these two weeks of holidays, there was no notable news related to these areas.

We look forward to 2023 for lots of news!

I wish everyone a happy 2023!

Azure Management services: what's new in December 2022

In December, Microsoft announced several news items regarding Azure management services. This summary, released on a monthly basis, aims to provide an overview of the month's main news, so you can stay up to date on these topics and have the references needed for further investigation.

The following diagram shows the different areas related to management, which are covered in this series of articles:

Figure 1 – Management services in Azure overview

Monitor

Azure Monitor

Azure Monitor Agent: IIS logs and custom logs

The Azure Monitor agent allows you to collect text files and IIS logs and send them to a Log Analytics workspace. In this regard, a new feature has been introduced to allow the collection of text logs generated by applications, exactly as already happens for Internet Information Services (IIS) logs.

Azure Monitor Logs: custom log API and ingestion-time transformation

A new set of features is now available in Azure Monitor that allows you to fully customize the shape of the data flowing into your workspace, plus a new API for ingesting custom data. Thanks to these features, it is possible to apply custom transformations to the data at ingestion time. These transformations can be used to extract fields during ingestion, obfuscate sensitive data, remove unnecessary fields or drop entire events (useful, for example, to contain costs). Furthermore, the data sent to the new custom logs API can be fully customized: in addition to specifying a transformation on the data sent to the new API, you can explicitly define the schema of your custom table (including dynamic data structures) and leverage Azure AD authentication and ARM RBAC management.
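A hedged sketch of sending custom data to the new ingestion API with the Python azure-monitor-ingestion library is shown below. The data collection endpoint, the DCR immutable ID and the stream name are placeholders that depend on the data collection rule you create; the ingestion-time transformation and the table schema live in the DCR itself, not in this code.

```python
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

# Assumptions: the endpoint, rule ID and stream name below are placeholders taken
# from your own data collection endpoint (DCE) and data collection rule (DCR).
ENDPOINT = "https://<data-collection-endpoint>.ingest.monitor.azure.com"
DCR_IMMUTABLE_ID = "dcr-00000000000000000000000000000000"
STREAM_NAME = "Custom-MyAppLogs_CL"

credential = DefaultAzureCredential()
client = LogsIngestionClient(endpoint=ENDPOINT, credential=credential)

# Records must match the stream schema declared in the DCR; any ingestion-time
# transformation defined on the DCR is applied before the data lands in the table.
logs = [
    {"TimeGenerated": "2023-01-31T12:00:00Z", "Application": "orders-api", "Message": "Order 1234 created"},
    {"TimeGenerated": "2023-01-31T12:00:05Z", "Application": "orders-api", "Message": "Order 1234 paid"},
]

client.upload(rule_id=DCR_IMMUTABLE_ID, stream_name=STREAM_NAME, logs=logs)
print(f"Uploaded {len(logs)} records to {STREAM_NAME}")
```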

Configure

Azure Automation

Extension for the Hybrid Runbook Worker

The User Hybrid Worker extension was announced in Azure Automation. It is based on the virtual machine extensions framework and offers an integrated installation experience. There is no dependency on the Log Analytics agent and workspace, and authentication uses system-assigned managed identities, eliminating the need to manage certificates. Furthermore, it ensures automatic minor version upgrades by default and simplifies the management of Hybrid Workers at scale through the Azure portal, PowerShell cmdlets, Azure CLI, Bicep, ARM templates and the REST API.
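Because the new worker is delivered as a virtual machine extension, its presence can be checked with the same generic extension APIs used for any other extension. The Python sketch below (azure-mgmt-compute assumed; all resource names are placeholders, and the publisher string it looks for is an assumption to verify against the Azure Automation documentation) lists the extensions on a VM and reports any Hybrid Worker extension found.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

# Assumptions: resource names are placeholders; the publisher substring checked for
# ("HybridWorker") is an assumption to verify against the current Automation docs.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
VM_NAME = "<vm-name>"

credential = DefaultAzureCredential()
compute_client = ComputeManagementClient(credential, SUBSCRIPTION_ID)

# List the extensions installed on the VM and check whether a Hybrid Worker
# extension is already present and set to auto-upgrade minor versions.
result = compute_client.virtual_machine_extensions.list(RESOURCE_GROUP, VM_NAME)
for ext in result.value or []:
    if "HybridWorker" in (ext.publisher or ""):
        print(f"{ext.name}: version {ext.type_handler_version}, "
              f"auto-upgrade minor versions: {ext.auto_upgrade_minor_version}")
```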

Govern

Azure Cost Management

Use tag inheritance for cost management (preview)

Tag inheritance was announced in a public preview, which allows you to automatically apply subscription and resource group tags to child resources. This mechanism simplifies cost management pipelines.

Updates related to Microsoft Cost Management

Microsoft is constantly looking for new ways to improve Microsoft Cost Management, the solution that provides greater visibility into where costs are accumulating in the cloud, identifies and prevents incorrect spending patterns and helps optimize costs. In this article, the main improvements and updates to this solution for 2022 are reported.

Azure Arc

Azure Arc enabled Azure Container Apps (preview)

Azure Container Apps enables developers to quickly build and deploy microservices and containerized applications. By deploying an Arc extension on an Azure Arc-enabled Kubernetes cluster, IT administrators retain control of the underlying hardware and environment while bringing the productivity of Azure PaaS services into a hybrid environment. The cluster can be on-premises or hosted in a third-party cloud. This approach allows developers to leverage the functionality and productivity of Azure Container Apps anywhere, not only in Azure, while IT administrators maintain corporate compliance by hosting applications in hybrid environments.

Azure Arc-enabled servers in Azure China

Azure Arc-enabled servers are now also available in two Azure China regions: China East 2 and China North 2.

Secure

Microsoft Defender for Cloud

New features, bug fixes and deprecated features of Microsoft Defender for Cloud

Microsoft Defender for Cloud is constantly evolving and improvements are being made on an ongoing basis. To stay up to date on the latest developments, Microsoft updates this page, which provides information about new features, bug fixes and deprecated features.

Protect

Azure Backup

Recovery of Azure virtual machines Cross Zonal

Azure Backup exploits the potential of zone-redundant storage (ZRS), which synchronously stores three replicas of the backup data across different Availability Zones. This allows recovery points stored in a Recovery Services vault with ZRS storage to be used even if the backup data in one of the Availability Zones is unavailable, ensuring data availability within a region.

The Cross Zonal Restore option can be considered when:

  • Zone-wide availability of backup data is critical, and backup data downtime is unacceptable. This allows you to restore Azure virtual machines and disks to any zone of your choice in the same region.
  • Backup data resilience is needed along with data residency.

Azure Kubernetes Service (AKS) Backup (private preview)

For the Azure Backup service, the private preview of AKS Backup was announced. Using this feature it is possible to:

  • Back up and restore containerized applications, both stateless and stateful, running on AKS clusters.
  • Back up and restore data stored on persistent volumes attached to clusters.
  • Perform backup orchestration and management from the Backup Center.

Azure Site Recovery

Increased churn limit (preview)

Azure Site Recovery (ASR) has increased the data churn limit by approximately 2.5 times, bringing it to 50 MB/s per disk. This way you can configure disaster recovery (DR) for Azure VMs with a data churn of up to 100 MB/s, enabling DR for I/O-intensive workloads. This feature is only available for Azure-to-Azure replication scenarios.

New Update Rollup

Update Rollup 65 has been released for Azure Site Recovery; it solves several issues and introduces some improvements. The details and the installation procedure can be found in the specific KB.

Migrate

Azure Migrate

New Azure Migrate releases and features

Azure Migrate is the Azure service that includes a large portfolio of tools you can use, through a guided experience, to effectively address the most common migration scenarios. To stay up to date on the latest developments in the solution, please consult this page, which provides information about new releases and features. The main news of this month is described in detail in the following paragraphs.

Software inventory and agentless dependency analysis

Azure Migrate agentless software inventory and dependency analysis is now available for Hyper-V VMs, bare-metal servers and servers running on other public clouds such as AWS and GCP. It is therefore possible to inventory the applications, roles and features installed on those systems. Furthermore, you can run dependency analysis on discovered Windows and Linux servers without installing any agents. These features make it possible to build more effective migration plans to Azure by grouping related servers together.

Building a business case with Azure Migrate (preview)

Azure Migrate's business case feature helps you build business propositions to understand how Azure can drive the most value. This solution allows you to understand the return on investment of migrating server systems, SQL Server deployments and ASP.NET web applications running in a VMware environment to Azure. The business case can be created with just a few clicks and can help you understand:

  • Total cost of ownership on-premises vs Azure and annual cash flow.
  • Resource utilization-based insights to identify ideal servers and workloads for the cloud and recommendations for right sizing in Azure.
  • Benefits for migration and modernization, including the end of support for Windows and SQL versions.
  • Long-term savings by moving from a capital expenditure model to an operating expenditure model, paying only for what you use.

Evaluation of Azure

To test for free and evaluate the services provided by Azure you can access this page.