What are the standards we should follow when we move to the Public Cloud?
We often get this question during Public Cloud conversations with different stakeholders. I would say ‘Change’ should be the standard. Ahh… What?
Be ready to accept a Change.
Be ready to execute a Change.
Be ready to prepare for future Changes.
The change should start in the minds of people. Developing a mindset that accepts change is the first step in the transformation journey. It is also the first barrier: each of us is comfortable continuing with the existing system, because change brings interruptions, and nobody likes interruptions.
To execute changes successfully, we should prepare our ecosystem to support them. We need people, tools, processes, etc. updated so that they complement the change.
Continuous change requires continuous preparation for future changes. This requires a lot of effort, forward thinking, and a clear roadmap as well.
Confidence makes you a good presenter. So what gives you confidence? Information: the more you know, the more confident you become.
Let me share the story of how I developed my presentation skills. Nowadays I do a lot of presentations to clients across the world, mostly around technical topics, predominantly Azure, which is obvious as I am an Azure Architect. If I look a few years back, I used to get nervous about it. Over time, especially after I moved into the DXC Azure team, I have been given many opportunities to present in front of internal as well as client senior leadership. In the initial days I struggled a bit, as I was in the hands of fear, what we call stage fright.
We often hear “We want to achieve Cloud Speed” when we talk about deployment and management of the Public Cloud. What does that mean? A VM can be deployed in a few minutes in the Public Cloud, so the question is “why do you take longer to start managing it?”. It is an interesting question, isn’t it? How many of you deal with such questions on a daily basis? As far as I understand, it isn’t an easy task for any company or service provider to achieve and sustain. It requires a lot of preparation to gain that speed, because it is not just about deployment but also about taking care of many things, from monitoring and management to security, billing, and more.
Let’s take a look at some of the key areas we need to focus on. Vrooooomm……
Azure App Service supports connecting to Application (App) Servers and Database (DB) Servers hosted in Azure VMs or on-premises servers using vNET integration, which is part of the App Service Plan (ASP). You will run into such scenarios often if you are migrating your application landscape to Azure from on-premises. You will have different scenarios, like shared DBs, a DB team that is not yet ready to move DBs to PaaS, or no PaaS DB service available for the likes of DB2 or Oracle. App Service vNET integration is a useful feature if you do not choose to go with App Service Environment (ASE).
When you configure vNET integration, your Web App gets a private IP from the vNET. The App Service will be able to communicate with a VM in the vNET or in another vNET peered in the same region, or with an on-premises server over ExpressRoute or S2S VPN.
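Once the integration is in place, a quick way to check that the Web App can actually reach the backend over the private network is a simple TCP probe (for example from the App Service console). A minimal sketch; the IPs and ports below are hypothetical placeholders, not values from this setup:

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder private addresses; replace with your own App/DB servers.
# can_reach("10.0.1.4", 1433)    # SQL Server on an Azure VM in the vNET
# can_reach("10.10.0.5", 50000)  # DB2 on an on-premises server over S2S VPN
```

A `False` here usually points at NSG rules, routing, or the integration itself rather than the application.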
Much awaited, badly wanted, but missing for a long time. It is now in Public Preview… Thanks, Microsoft, for adding it.
One of the reasons I was staying away from recommending Azure Bastion to customers was primarily this missing feature. I think it is time to change my mind and recommend Azure Bastion, as it saves a lot of dollars now: we are moving away from a per-vNET deployment model to per AAD tenant, or as per customer requirement.
Let’s focus on Azure SQL Database reservations today. If you have not read my other blog posts on VM reservations, please read them here. You can save on SQL costs with Azure reserved capacity. It covers both Azure SQL Database and SQL Managed Instance. To buy reserved capacity you must be the owner of the subscription (or an EA admin if it is an EA subscription), or an admin agent or sales agent for CSP subscriptions.
Importantly, a reservation covers only the compute charges of the instances in the subscription; it does not cover the software, networking, or storage charges associated with the services. Note the points below when you think about reserving Azure SQL Database or SQL Managed Instance. It provides a good amount of cost saving, along with Azure Hybrid Benefit.
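To see how the reservation and Azure Hybrid Benefit stack up, here is a small sketch. The discount and license figures are illustrative assumptions, not real Azure prices; check the pricing calculator for actual rates:

```python
def effective_rate(payg_per_hour: float,
                   reservation_discount: float = 0.0,
                   license_share: float = 0.0) -> float:
    """Hourly compute rate after applying a reservation discount and
    removing the SQL license portion covered by Azure Hybrid Benefit.
    Reservations cover compute only, never storage or networking."""
    return payg_per_hour * (1 - reservation_discount) * (1 - license_share)

# Assumed numbers: $1.00/hr pay-as-you-go, a 33% reservation discount,
# and 30% of the rate being SQL licensing covered by Hybrid Benefit.
rate = effective_rate(1.00, reservation_discount=0.33, license_share=0.30)
```

The point of the multiplication is that the two savings compound: with the assumptions above, the effective compute rate drops to roughly 47% of pay-as-you-go.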
We talked about Azure Reservations for VMs in my previous blog post. If you have not read it already, I suggest you read it.
The VM RI discount applies only to the VM infrastructure cost; it does not include the disks or storage used. Azure Disk Storage reservations combined with Azure Reserved VM Instances help you reduce the total VM cost.
The common rule applies here as well: “Use it or lose it”.
Currently, Azure Disk Storage reservations are available only for selected Azure Premium SSD SKUs. They do not apply to unmanaged disks, Ultra Disks, or page blob consumption.
The disk reservation is not based on capacity but on the total number of disks per SKU. That means reservation consumption is counted in units of the disk SKU rather than the provisioned size. For example, you cannot use a P40 reservation for two P30 disks. It does not have the instance size flexibility that VM reservations offer.
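Because disk reservations match on SKU units rather than capacity, the billing split can be modelled in a few lines. A sketch (the SKU names and counts are just examples):

```python
from collections import Counter

def apply_disk_reservation(disks, reserved):
    """Split deployed disks into reservation-covered and pay-as-you-go.
    Matching is per SKU unit (e.g. 'P30'), never by total capacity,
    so a P40 reservation cannot cover two P30 disks."""
    covered, payg = Counter(), Counter()
    for sku, n in Counter(disks).items():
        c = min(n, reserved.get(sku, 0))
        if c:
            covered[sku] = c
        if n - c:
            payg[sku] = n - c
    return covered, payg

covered, payg = apply_disk_reservation(
    ["P30", "P30", "P40"], {"P30": 1, "P40": 2})
# Only one P30 is covered; the second P30 bills pay-as-you-go even
# though an unused P40 reservation unit remains.
```

This is exactly the trap to avoid when sizing the purchase: buy units per SKU you actually run, not total terabytes.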
Let me try to bring some insight into Azure Storage reservations. If you have not read my previous blog posts on VM Reserved Instances, please read them here. You can save on storage costs for blob data with Azure Storage reserved capacity. It covers both block blobs and Data Lake Storage Gen2. To buy Azure Blob Storage reserved capacity you must be the owner of the subscription (or an EA admin if it is an EA subscription), or an admin agent or sales agent for CSP subscriptions. I have tried to bring information from different MS articles into one place, as bullet points, to help you with this.
Importantly, a reservation covers only the data stored in the subscription; it does not cover other charges such as early deletion, operations, bandwidth, and data transfer. Note the points below when you think about reserving Azure Storage.
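Since reserved capacity is also “use it or lose it”, a quick break-even check is worth doing before buying. A sketch with placeholder prices (not real Azure rates):

```python
def break_even_tb(reserved_monthly_cost: float,
                  payg_per_tb_month: float) -> float:
    """TB you must actually store each month for the reservation to beat
    pay-as-you-go pricing."""
    return reserved_monthly_cost / payg_per_tb_month

# Assumed prices: a 100 TB reservation costing $1,500/month versus
# $20 per TB-month pay-as-you-go.
tb = break_even_tb(1500.0, 20.0)  # below this many TB, PAYG is cheaper
```

With these assumed numbers you would need to keep at least 75 TB stored for the 100 TB reservation to pay off.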
There lived a monster in a tiny village. Several men tried to fight the monster. When they attacked the monster with swords, it grabbed the weapon, pulled out another one twice as sharp and large, and attacked them back. This continued regardless of the methods they tried. However, one day a little boy went along with the others and offered the monster an apple. The monster grabbed it and returned two delicious apples, twice as red and large as the one the boy had offered. Soon, the villagers realized that the monster was not a curse but a blessing. I read this story in the book The Secret of Leadership by Prakash Iyer.
Work from home is nothing new for us, but amid the Covid-19 outbreak we are forced to WFH now. It is our responsibility to exercise social distancing to break the chain, and at the same time we need to support our company and our customers.
I have been doing WFH for more than a month now and have also cancelled all official and personal trips. I had already set up my home office, since I used to WFH a few days in a week. I often tend to work long hours, so it was important to make sure I have a comfortable working experience at my home office.
Occasionally we work from home, but it is a different situation now. I think this situation will continue for another two or three months, if I analyse it correctly. So it is important to have the necessary facilities at home to work 8 or more hours continuously. Let me list some important tips for a better working environment at home.
We hear a lot about automation nowadays. Should we automate anything and everything? Yes, most of it. However, we must understand why we need to bring in automation. As far as I understand, we should automate when at least one of the points below applies.
Is it a repeated task?
Does it save time during deployment (like reducing downtime)?
Does it avoid human error?
Does it bring standardization in repeated tasks?
I have seen engineers trying to codify everything, wasting hours or even days on something that could have been done in a few minutes. I have done such things in the past too, but that was for learning.
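That hours-versus-minutes trade-off is really a break-even question. A trivial sketch of the rule of thumb (the run counts and durations are made-up examples):

```python
def worth_automating(manual_minutes: float, runs: int,
                     automation_minutes: float) -> bool:
    """True if the time saved across the expected number of runs
    exceeds the time spent building the automation."""
    return manual_minutes * runs > automation_minutes

# A 10-minute task expected to run 50 times justifies a day (480 min)
# of scripting, while codifying a 5-minute one-off for 2 days does not.
worth_automating(10, 50, 480)   # True
worth_automating(5, 1, 960)     # False
```

Repeatability, error avoidance, and standardization can tip the balance even when the raw minutes do not, but the arithmetic is a good sanity check first.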
I am writing this blog to explain the different methods for connecting to Azure Database for PostgreSQL, for those who know Azure but have never worked with PostgreSQL. So I will not explain how to manage databases, but how to manage the PaaS from the portal. You may review the details of this PaaS service in the Microsoft documentation to understand the SKUs, the different plans, generations, etc. I am just sharing some of my experience from when I dealt with this for the first time.
Before we get into the connectivity methods, let’s talk about deploying PostgreSQL with the General Purpose performance configuration, which I think is important. The available generation is Gen 5, as Gen 4 is not available for deployment.
We can use either ARM templates or the portal to perform the deployment. There are a few ARM templates for PostgreSQL available on GitHub.
There are a few things you need to remember when you input the parameters using ARM. The points below might confuse you.
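One classic point of confusion once the server is up is the sign-in format: the single-server offering expects the user name as `adminuser@servername`. A minimal connection sketch with `psycopg2`; the helper function and all values are placeholders of my own, not from any SDK:

```python
def conn_params(server: str, admin_user: str, password: str,
                database: str = "postgres") -> dict:
    """Assemble psycopg2 connection parameters for an Azure Database
    for PostgreSQL (single server). Note the mandatory '@servername'
    suffix on the user name and the enforced SSL."""
    return {
        "host": f"{server}.postgres.database.azure.com",
        "user": f"{admin_user}@{server}",  # forgetting the suffix fails auth
        "password": password,
        "dbname": database,
        "sslmode": "require",              # SSL is enforced by default
    }

params = conn_params("myserver", "pgadmin", "<password>")
# import psycopg2
# conn = psycopg2.connect(**params)
```

If authentication fails even with the right password, the `@servername` suffix and the server firewall rules are the first two things to check.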
I am starting my 2020 with a review of the Azure Bastion Host.
It was welcome to see Microsoft introduce the Azure Bastion Host. It allows you to connect to your VMs without having a public IP configured on the VM. I reviewed the preview of this in my blog last year. I suggest you read my blog and the other Microsoft articles for the details, as I am not explaining them in this post.
I noticed Microsoft has added the below features with GA:
Extended to a few more regions
Integrated with Log Analytics, which provides audit logs.
A small help can bring a smile to someone’s face. It doesn’t matter how big or small; what matters is that we do it. How can we make it part of our lifestyle?
Let’s decide to donate something whenever we achieve something in our lives. For example, when you get a salary increment or a bonus, or it could be a certification or a degree.
Personally, I make a wish before I do things in my life that I will donate an amount to those who really need it. It may be a trek, travel, an exam, a promotion, a new job, or anything else. I do not know if someone is hearing this or not, but it gives me a lot of happiness and peace. I suggest you try this at least once; I am sure it would bring a smile to someone, and to you as well.
It is not always the salesman’s responsibility to sell. I think the delivery team is well placed to sell once the contract is signed. I have been working as an architect for quite some time, so I think it is better I talk about this as an Architect.
I think an Architect is naturally placed to build a relationship with the customer and grow the business. He is probably the first individual to start working with the client once the deal is signed. He will be working with the client’s key technical contacts, including the CTO, CIO, and IT directors who make or influence the IT decisions. So the Architect is a key person, instrumental in driving clients to find their best solutions.
It was almost a 2-year break from MS certifications before I tried AZ-500 early this week, and it was an interesting one. It was the first MS certification I have ever taken with a hands-on lab, though that was a bit of a surprise. I thought of sharing my exam experience, which might be helpful if you are trying to get this certification.
The exam is a total of 3.5 hours, with 3 hours of exam time. I suggest you go through the exam skills outline before starting your preparation. I started with a course on Linux Academy. I found it especially good for Azure Active Directory, as it covers all the AAD features that are part of P2. The course covers almost all the subjects required to start preparing for the exam. However, don’t stop there: you need to deep dive into each subject with the MS documentation. Importantly, you need to do a lot of hands-on work for each topic described in the exam skills outline.
Every failure is a lot of learning, and learning is key to success. The question is: can we not learn without failure? Maybe. But failure is not bad if you don’t stop there.
You have seen this subject in many articles already, so I do not want to repeat it; instead I would like to talk about building a culture in the organization that encourages people to take risks without the fear that failure will cost them their job. Leadership in the organization should be able to bring in the innovation needed to accelerate growth.
If there is fear of losing the job due to failure, people will be scared to execute a change. Instead, they will stick to the safest route to save their own skin. That is a dangerous thing to happen to an organization, because that organization will die from lack of innovation. We have read about companies like Kodak, Blackberry, and Nokia, who forgot, or were too late, to make the much-needed changes.
Do you see your ideas not being accepted by your management because you are not as experienced as others? If yes, I think that is not right. Experience always plays a good role in making decisions and selecting the right solution, but it should not be a limit on accepting ideas from others.
I just read the caption below at a restaurant in Chennai.
It was an interesting caption.
It is important for every team to build a culture that encourages everybody in the team to come up with ideas, and to reward them for the good ones. It would help the team be more innovative, and innovation is key to success.
What is key for this? A good leadership team that does not think ‘Egg or Chicken First’.
I was exploring information on each Azure region, but I could not find a single page with all of it listed. So I thought of creating a table and sharing it with our Cloud Community. The research for this blog made me realize some interesting facts about Azure regions. I think it will help Architects who want to quickly check facts while designing their Azure solutions.
I will try my best to keep the table updated to ensure that you get the latest information. At the same time, you can also verify this information from the URLs provided at the bottom of this blog.
What you get from this blog post:
Geography, Azure region, Availability Zones (AZ) in that region, location, and its paired region(s), in a single table.
You can get all the resources in a Geography by clicking on it in the first column. I have selected all the Azure native services available in that area for you, so you get the services for all the regions in a single click.
I have marked Featured Regions in bold, and you get details about a region by clicking on the region column where applicable.
Some facts about Azure regions:
Only the South India, South Central US, and US Gov Texas Azure regions are paired with more than one region. But conditions apply.
West India’s paired region is South India, but it is paired only in one direction.
South India’s secondary region is only Central India.
Brazil South’s secondary region is South Central US, but South Central US’s secondary region is not Brazil South.
Interestingly, US Gov Virginia’s secondary region is US Gov Texas, and then you see US Gov Arizona is the secondary region of US Gov Texas.
Only 6 Azure regions have paired regions in different countries.
South Central US
Only three Azure regions are located in undisclosed locations in the US.
Switzerland North is currently available only for selected customers. You need to contact support to create resources there.
Switzerland West is reserved for customers requiring in-country disaster recovery. You may need to contact Azure Support to create resources there.
The below is a nice illustration mapping the security services of different Cloud Service Providers. I see Azure clearly winning, as you hardly see any third-party solutions mapped in their security product list. It does not tell you which service serves the various customer use cases better, though. It is interesting to see Alibaba catching up with its list of products.
“I am moving to the cloud. I can finish it in a snap; everything is available there.” I hear this a lot from customers and leadership. I cannot blame them, because that is what every Cloud Service Provider is saying.
I tell them, “only computing moved up into the sky, and we call it Public Cloud”. If you observe it from ground zero, that is true in many cases, because we have not changed our mindset and we are not prepared enough yet. To move our workload to the public cloud we must change our mindset, know the public cloud, and prepare.
It is important that we build a public cloud ecosystem, for every company and especially for a Public Cloud service provider. Be it Azure or AWS, they do not do everything for you; they give you multiple options, like tools to do it yourself. You still need to think about monitoring of middleware, databases, and applications, plus backup, disaster recovery, migrations, and so on. Nobody wants to run just an OS on a VM and monitor and back it up; they want to run their workload. They need to move their workloads, which need to leverage many other services like those mentioned above.
It may not be possible for a single team to support everything, because you need to bring in specialization as well. If you are part of a large service organization, it makes sense to partner with your existing traditional teams to come up with a cloud-native approach and extend their support to the cloud platform. It would help them up-skill, and at the same time the company can grow faster to support dynamic requirements from customers.
We often create dependencies to protect ourselves and show our value in the team. I have seen people go on vacation while their colleagues struggle to back them up due to poor handover and knowledge sharing. That leaves the client and leadership team saying, “You cannot run it without Tommy” (Tommy is just an imaginary Mr. Depended-Upon). This isn’t good for the organization, the team members, or Tommy himself. It is the negative way; I wouldn’t want to work with such people, and I would not rate Tommy highly no matter how well he does his assigned work. Continue reading “I did it, we will do it together now – Dependency Created.”
Azure Dedicated Host enables you to run your organization’s Linux and Windows virtual machines on single-tenant physical servers. It provides you with the visibility and control to help address corporate compliance and regulatory requirements.
You can find the documentation from Microsoft here.
AWS has had this feature for some time now; it is good that Microsoft is also catching up and closing the gap.
Benefits of Dedicated Hosts:
Azure Hybrid Benefit for Azure Dedicated Hosts – Microsoft offers on-premises Windows
Host level isolation
Underlying hardware infrastructure
Processor brand, capabilities and more
Type and size of the Azure
With an Azure Dedicated Host, you can control all host-level platform maintenance initiated by Azure (e.g., host OS updates). An Azure Dedicated Host gives you the option to defer host maintenance operations and apply them within a defined 35-day maintenance window. During this self-maintenance window, you can apply maintenance to your hosts at your convenience, gaining full control over the sequence and velocity of the maintenance process. Continue reading “Review of Preview – Azure Dedicated Host”
How can we securely connect to Azure VMs from the Internet without assigning a Public IP to the VM? Using a Public IP is not recommended at all. So I think Azure Bastion Host is a welcome move from Microsoft to help its customers, especially administrators. I am not sure how many of you like using Point-to-Site.
Note: There are many secure ways of connecting to Azure using different services, including marketplace solutions. I am not comparing those in this blog.
I do not want to repeat the content already available in the Microsoft documentation in this blog. However, I would like to talk about my findings and how useful it would be, based on my trial run.
I explained in my previous blog how to save cost using RIs and how to plan for them. Let’s look at combinations of runbooks and Reserved Instances in this blog to save cost further. We should be careful to get the cost savings out of this model.
You can power off the VMs in a non-production environment to save cost when they are not in use. You can use runbooks to schedule powering the VMs on and off. The table below provides insight into the level of cost saving that runbooks can bring.
If you power off your VMs after working hours and on holidays, you can save almost up to 76%. I have placed the 3 VMs in different time durations to explain the RI calculation later in this blog. I considered 22 days in a month, which excludes weekends.
Table 1 – Saving using Runbooks
However, if you also consider public holidays, the savings would certainly be greater. The runbooks provide more cost savings than the 1-year and 3-year RIs, which give 21% and 34% respectively. This shows that RIs do not suit your non-production environment, because runbooks give you more cost savings than RIs. So you may consider RIs only for the production environment, unless you see a good amount of usage in the non-production environment. Continue reading “Save Cost with combinations of Runbooks and Reserved Instances”
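The headline 76% figure is easy to reproduce. With the same assumptions as above (8 working hours a day, 22 working days a month, weekends off, a ~730-hour month), a sketch:

```python
def runbook_saving(hours_per_day: float, days_per_month: float,
                   month_hours: float = 730.0) -> float:
    """Fraction of compute cost saved by powering VMs off outside the
    given schedule, versus running them 24x7 (assumes per-hour billing)."""
    running = hours_per_day * days_per_month
    return 1 - running / month_hours

saving = runbook_saving(8, 22)  # 176 of 730 hours running, ~76% saved
```

Since both the 21% and 34% RI discounts fall well short of that, the arithmetic backs the conclusion that runbooks beat RIs for lightly used non-production VMs.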