Demystifying cloud technology for everyone

GraphAPI – Filter based on ExtensionAttribute

Even though the Graph API is nice and impressive with its throughput, the filtering part always looks complex to me, especially when the query itself gets complex. Hence, I thought of sharing my experiments with Graph API filtering.

Let's look at the one I chose today. When we have a custom attribute on-premises that is synchronized to Azure AD, we may need to filter objects based on the value of that custom attribute. Filtering is supported, and here is how we do it.

Get-MgUser -Filter "onPremisesExtensionAttributes/ExtensionAttribute1 eq 'Value'" -ConsistencyLevel eventual -Count userCount

Note that the ConsistencyLevel and Count parameters are mandatory to get this working. Otherwise, you may get the below error.

Get-MgUser_List: Unsupported or invalid query filter clause specified for property 'extensionAttribute1' of resource 'User'.
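For reference, the same advanced query can be issued against the raw Graph endpoint; it needs the ConsistencyLevel header together with $count=true, mirroring the cmdlet parameters above. A sketch using Invoke-MgGraphRequest from the same SDK:

```powershell
# raw Graph equivalent of the filter above; advanced queries against
# directory objects require the ConsistencyLevel header and $count=true
Invoke-MgGraphRequest -Method GET `
    -Uri "https://graph.microsoft.com/v1.0/users?`$filter=onPremisesExtensionAttributes/extensionAttribute1 eq 'Value'&`$count=true" `
    -Headers @{ ConsistencyLevel = 'eventual' }
```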

Microsoft Reference - https://learn.microsoft.com/en-us/graph/filter-query-parameter

Posted by Shabarinath, 0 comments

(Get-Date).AddDays along with Format switch

We commonly use the -Format switch along with the Get-Date cmdlet to adjust the date format. However, it's not really possible to club a date format change together with the .AddDays() method:

(Get-Date -Format yyyyMMdd).AddDays(x)

This is because the -Format switch changes the output type to [String] while changing the date format, and .AddDays() is a method of the [DateTime] type.

When this line executes, the formatting happens first and then PowerShell tries to invoke the .AddDays() method on the resulting string, which fails as expected.

Method invocation failed because [System.String] does not contain a method named 'AddDays'.

The error message is pretty straightforward: the method AddDays is not found within [System.String]. So the question is how to deal with this? The traditional option is to store the changed date in a variable first and then do the formatting as a second step.

So let's look at some more ways to accomplish this in a single line. Possible?

1. Call Get-Date with the .AddDays() arithmetic nested inside a formatting Get-Date call

2. Use the .AddDays() method first and then .ToString()

Both are simple, but which one is better?

(Get-Date).AddDays(x).ToString('yyyyMMdd') is slightly better, as it's approximately one millisecond quicker to finish.
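Both one-liners, side by side:

```powershell
# option 1: nest the date arithmetic inside a formatting Get-Date call
Get-Date (Get-Date).AddDays(7) -Format yyyyMMdd

# option 2: do the arithmetic first, then convert the DateTime to a string
(Get-Date).AddDays(7).ToString('yyyyMMdd')
```

Both return the date seven days out as a yyyyMMdd string.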

Here are some interesting articles to read on Get-Date and the options around it:



Posted by Shabarinath in PowerShell, 0 comments

Microsoft Sentinel – Configuring Windows Server using Azure Monitoring Agent

Using the legacy agent, as discussed in the previous post, is one option for configuring Windows servers with Sentinel. As the name states, it's a legacy option and has its own limitations. Last year, Microsoft came up with a new approach making use of the Azure Monitoring Agent. With this approach, the collected logs are sent to an Azure Log Analytics workspace, which is shared between Azure Monitor and Microsoft Sentinel. The same data is also available to Microsoft Defender. As per Microsoft, all legacy agents currently in use will be replaced by the Azure Monitoring Agent, and the legacy agents are expected to retire on 31st Aug 2024.

Configuration is almost similar to the legacy agent installation, with a few additional steps to complete the setup.

From the Sentinel Dashboard, Navigate to Data Connectors.

Search for "Windows Security Events via AMA" and select the connector from search result.

Click on Open Connector Page from the bottom right corner.

On the connector page, the difference compared to the legacy agent installation is that we don't see any agent details. Instead, we have an informational message which says that to collect data from non-Azure VMs, they must have Azure Arc installed and enabled.

So we need to navigate to the Azure Arc portal. Azure Arc is a set of technologies that brings Azure security and cloud-native services to hybrid and multicloud environments. From the Azure Portal, search for "Azure Arc" and navigate to the Azure Arc portal. Select Servers from the menu on the left side.

Click on the + Add from the top menu.

Now the page redirects to "Add servers with Azure Arc". This page has multiple options for deploying the Arc agent. The bottom line is that the Azure Connected Machine Agent needs to be installed, either manually or through a script. Once installation is done, the agent needs to be connected to the Azure tenant. If it's a single server, "Generate a script" will help create a script customized with the values specific to your environment.

Navigate through the initial screens. The final page will generate the script as well as gives the option for registering the subscription.

Registering is a one-time activity. Just click on Register and that's it.

Now, copy the script and run it from the on-prem server.

This script will do the following :

  1. Download the agent from the Microsoft Download Center.
  2. Install the agent on the server.
  3. Create the Azure Arc-enabled server resource and associate it with the agent.

If internet connectivity is not available from the server, download the MSI installer first from https://aka.ms/AzureConnectedMachineAgent and copy it to the respective on-prem servers. The actual registration happens by running azcmagent.exe, which is located in C:\Program Files\AzureConnectedMachineAgent, so you can do a manual installation if that's required.
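For a manual installation, the connect step can be sketched like this (the resource group, location and IDs below are placeholders for values from your own subscription):

```powershell
# connect the installed agent to your tenant; this triggers the device login
& "C:\Program Files\AzureConnectedMachineAgent\azcmagent.exe" connect `
    --resource-group "rg-arc-servers" `
    --tenant-id "<TenantId>" `
    --subscription-id "<SubscriptionId>" `
    --location "<Region>" `
    --cloud "AzureCloud"
```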

Running the script from PowerShell console is easy.

The script will halt in the middle to complete the device login, and once authenticated, we can get the rules configured in Azure Sentinel.

Navigate to Azure Portal -> Sentinel -> Data Connectors -> Windows Security Events via AMA

Click on Open Connector page at the bottom right corner

Click on "+Create data collection rule" under the Configuration section

Enter the Rule Name, Subscription and Resource Group.

Click on Next

On the Resources page, select the on-prem computer

Click on Next

Select the events to stream.

Complete the validation and go ahead with creation

The new data collection rule will be visible in the Configuration section

Events will start populating after a while.

Good luck !

Posted by Shabarinath in Microsoft Sentinel, 0 comments

Microsoft Sentinel – How to configure on-premise domain controller?

Microsoft Sentinel provides two different options for collecting security logs from an on-premises server. Here is the simple, straightforward way using the legacy agent option.

Navigate to the Data Connectors from Azure Sentinel dashboard.

Search for "Security Events via Legacy Agent" and click on "Open Connector" at the bottom right corner.

Download the agent from the link provided. Microsoft has two different agent types - one for VMs hosted in Azure and the other for machines hosted in any environment other than Azure.

On the events to stream option, select the appropriate one. Since I was not sure about the full list of events in the different categories, I opted for "All Events".

On the download agent page, we have the 32-bit agent as well as the 64-bit agent, along with the workspace ID and the keys. The workspace ID and a key are used while installing the agent on the on-premises server. If the server doesn't have direct internet connectivity, Microsoft gives us the option to use a gateway server: all on-premises servers send logs to a Log Analytics gateway server, which then forwards the logs to the Azure Log Analytics workspace. Since I did this in my lab environment, I have unrestricted internet for the domain controllers, so the events are sent directly to the Azure LA workspace.

Now, navigate to the server. Download the agent and keep the workspace ID and a key handy; either the primary key or the secondary key will work.

This agent is basically the MOM agent with additional functionality for integrating with Azure Log Analytics.

On the welcome screen, click next. Accept the license agreement and click next.

Choose the installation folder

Choose "Connect the agent to Azure Log Analytics (OMS)" and click next

Now it's time to configure the workspace ID and the key. In case a proxy is used, configure the proxy settings too using the Advanced button.

Choose the Windows update option as per your wish and go ahead with the installation.

Once the installation is complete, verify the agent health from Control Panel -> Microsoft Monitoring Agent. If the connectivity is fine, a green tick mark will be visible in the status column.
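If you prefer scripting the rollout instead of clicking through the wizard, the agent also supports a documented command-line install. A sketch (the installer file name depends on the 32-bit/64-bit download; replace the workspace placeholders with the values from the connector page):

```powershell
# extract the downloaded installer package to a temporary folder
.\MMASetup-AMD64.exe /c /t:C:\Temp\MMA

# silent install, pointing the agent at the Log Analytics workspace
C:\Temp\MMA\setup.exe /qn NOAPM=1 ADD_OPINSIGHTS_WORKSPACE=1 `
    OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=0 `
    OPINSIGHTS_WORKSPACE_ID="<WorkspaceId>" `
    OPINSIGHTS_WORKSPACE_KEY="<WorkspaceKey>" `
    AcceptEndUserLicenseAgreement=1
```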

In a few minutes, the changes will be reflected on the Azure Sentinel connector page.

One of the main concerns with using the legacy agent is the time required to get the events into Azure Log Analytics. I did a small test to verify this, and it was quick.

Here is the raw event from the Windows Security event log.

I then searched the Azure Log Analytics workspace. For the respective event, Microsoft also stamps an additional field called TimeCollected; TimeGenerated is the actual time the event was triggered. Both fields are in UTC. As you can see, the event got collected within a minute of being generated, and it was available in the Azure Log Analytics workspace within 3-4 minutes.

Refer to the Azure Sentinel data collection best practices here.

Posted by Shabarinath in Microsoft Sentinel, 0 comments

How to learn KQL using a live lab?

"KQL" came into the limelight around 2020. I started using KQL while working with Log Analytics workspaces in Azure, though it was just basic queries, and I was not that keen to learn it at that point. Over time, KQL has been gaining prominence.

Don't get confused with Kibana Query Language, as both are referred to as KQL in short. Kusto Query Language is a powerful tool to explore your data and discover patterns, identify anomalies and outliers, create statistical models, and more.

Kusto is basically a cloud data analytics platform, optimized for interactive, ad-hoc queries over structured, semi-structured and unstructured data. It started as an internal tool and later became available to the public as "Azure Data Explorer" in 2018; Azure Data Explorer was built on top of Kusto. Kusto Query Language is now used by multiple services within Azure, the key one being Microsoft Sentinel.

Microsoft provides a lab environment for anyone who doesn't have access to Azure to play around with.

Give it a try @ https://aka.ms/LADemo

KQL Log Analytics

Another option which can be used is Application Insight Demo platform.

Give it a try @ https://aka.ms/AIAnalyticsDemo

KQL Application Insight

Both demo environments require a Microsoft account, but you can play around and learn KQL without any worry.
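A first query to try in the Log Analytics demo workspace could look like this (a sketch; SecurityEvent is one of the tables exposed there, so adjust the table name to whatever data you see):

```kql
// count events per computer over the last day, busiest machines first
SecurityEvent
| where TimeGenerated > ago(1d)
| summarize EventCount = count() by Computer
| top 10 by EventCount
```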

Good luck !

Posted by Shabarinath in Azure, KQL, 0 comments

Install-Module : A parameter cannot be found that matches parameter name ‘AllowPrerelease’.

Microsoft is releasing new versions of PowerShell modules frequently now. And the real truth is that many of the cmdlets have more bugs these days than earlier (my personal opinion, not based on any statistics). In parallel, new services are getting launched and new modules are coming up to support them. Many times, we are forced to use the pre-release version to see if a bug is fixed in the upcoming version.

The most common approach is to install the latest version directly from the online repository if the client machine has access to the internet. The cmdlet to be used is Install-Module with an additional switch, -AllowPrerelease. However, it's common to end up with the below error, especially with newer module versions.

Install-Module : A parameter cannot be found that matches parameter name 'AllowPrerelease'.
At line:1 char:37
+ Install-Module -Name MicrosoftTeams -AllowPrerelease
+                                     ~~~~
    + CategoryInfo          : InvalidArgument: (:) [Install-Module], ParameterBindingException
    + FullyQualifiedErrorId : NamedParameterNotFound,Install-Module

The cmdlet is not accepting the -AllowPrerelease switch.

As per the release notes, the minimum PowerShell version is 5.1, and I was running 5.1.17763.1971 on my server, yet I still ended up with this error.


Go ahead and install PowerShell 7.x.

You can install pre-release versions through PowerShell 7; the -AllowPrerelease switch works fine there.

An additional error will come up asking to include -Force to install a pre-release version. With that, we are good to go!
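Putting it together in PowerShell 7 (a sketch; updating PowerShellGet first is what actually provides the -AllowPrerelease parameter):

```powershell
# make sure a recent PowerShellGet is in place first
Install-Module -Name PowerShellGet -Force

# then install the pre-release module; -Force is needed when
# moving from a stable version to a prerelease
Install-Module -Name MicrosoftTeams -AllowPrerelease -Force
```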

Posted by Shabarinath in PowerShell, 0 comments

Global Admin vs Company Administrator – Naming Standardization Required

Recently, I was trying to list the Global Administrators on my test tenant. The quick option I had was to query the group membership as we do in an on-premises Active Directory environment. But then I realized that tenant admins don't have access to these role groups the same way we access normal groups.

Office365 provides a lot of role groups by default. Role groups rely on Azure AD groups, but tenant admins are restricted from accessing them directly. So we cannot access them the same way we access groups using the Azure AD portal or the Get-AzureADGroup cmdlet. However, Microsoft has provided multiple options to access the role groups as well as the role group membership.

Here is the first option, using the MSOL cmdlet Get-MsolRoleMember:

Get-MsolRoleMember -RoleObjectId "62e90394-69f5-4237-9190-012177145e10"

We should give the role object ID to list the role group members. The catch here is the names. If we look for "Global Administrators" in the result of Get-MsolRole, we cannot find one. In the GUI we have Global Administrators, but in PowerShell it's called Company Administrators. 🙂

The next option is to query the role through the AzureAD module. For that, we use the cmdlet Get-AzureADDirectoryRoleMember and pass the object ID of the respective role group as the parameter:

Get-AzureADDirectoryRoleMember -ObjectId 2e6d232a-5bbd-4643-9ad2-bfd899258406

The twist here is the group name. The object ID should be grabbed from Get-AzureADDirectoryRole, and we should look for the role group "Global Administrators" here, not Company Administrators. :D
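The two lookups can be combined so you never have to hardcode an object ID (a sketch; matching both display names covers the naming inconsistency described above):

```powershell
# resolve the role by display name, then list its members
$role = Get-AzureADDirectoryRole |
    Where-Object { $_.DisplayName -in 'Global Administrator', 'Company Administrator' }
Get-AzureADDirectoryRoleMember -ObjectId $role.ObjectId |
    Select-Object DisplayName, UserPrincipalName
```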

I hope Microsoft will eventually standardize on the name Global Administrators everywhere.

Posted by Shabarinath in AzureAd, 0 comments

Microsoft Team – SharePoint Online – M365 Group – How all are getting connected?

Teams is one of the Office365 products that got acceptance within a very short span of time. The main reason Teams is still considered a great product is its tight integration with different products, enabling a truly collaborative suite; this is from an end user's perspective. Teams eventually became more prominent than SharePoint, even though Teams still consumes SharePoint services in different places.

The more integration there is, the better the understanding IT administrators need to manage the service. With SaaS or PaaS models, integrations mostly happen in the backend, keeping things invisible to IT admins. Along with this, the admin console and PowerShell modules for Microsoft Teams are still evolving. Hence, it's vital for IT admins to have a good level of insight to make sure that none of the integrations are broken by a change.

Let's have a look at this with a real example.

I have created a Team named "Finance" on the Office365 tenant. The Team gets provisioned, and along with it, a few other components get provisioned in the background:

  1. Unified Group
  2. SharePoint Site

Let's have a look at the attributes visible to admins for the new Team using the cmdlet Get-Team. Make a note of the GroupId attribute and its value.

Now let's look at the unified group. Look for the attribute ExternalDirectoryObjectId.

Let's move on to the SharePoint site. Look at the attribute GroupId.

It seems GroupId is used to integrate the Team with the Microsoft 365 group as well as the SharePoint Online site. We don't have an option to update these attributes directly through the shell, and hence any change on any of these interconnected services needs to be done with a note of caution.
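The shared ID can be traced across all three services in one go (a sketch; it assumes the Teams, Exchange Online and SharePoint Online modules are installed and connected):

```powershell
$team = Get-Team -DisplayName 'Finance'
$team.GroupId

# the unified group exposes the same ID as ExternalDirectoryObjectId
Get-UnifiedGroup -Identity $team.GroupId |
    Select-Object DisplayName, ExternalDirectoryObjectId

# the SharePoint site exposes it as GroupId
Get-SPOSite -Limit All | Where-Object { $_.GroupId -eq $team.GroupId }
```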

Posted by Shabarinath in Teams, 0 comments

GoDaddy Office365 to Microsoft Office365 Migration- Part 14

Post Migration Activities

We have the domain name added to the new Office365 tenant successfully, and the DNS records are also updated. However, that's not enough to resume the mail flow. Once the domain is added to the Office365 tenant, Exchange Online Protection, the email gateway with various security controls, can start processing inbound emails. However, the emails will not get delivered, as the SMTP address is yet to be added to the respective mailbox users. Hence, the next activity, which should be done immediately, is to add the SMTP address of your organization and make it the primary address. In my test tenant, I need my test users to have an @activedirectory.in email address added and made primary.

As a best practice, the UPN should match the primary SMTP address. Hence, both the UPN and the primary address need to be adjusted first.

For changing the UPN:

Set-MsolUserPrincipalName -UserPrincipalName shabarinath@activeidirectoryin.onmicrosoft.com -NewUserPrincipalName shabarinath@activedirectory.in

With the UPN and email address changes, mail flow should fully resume. Within Office365 tenants, mail flow usually resumes in 15-30 minutes. External email services will also resume after this.
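For the primary SMTP address, a single-user sketch could look like this (the identity below is hypothetical; WindowsEmailAddress sets the address and makes it the primary):

```powershell
# set the new domain address as the primary SMTP address
Set-Mailbox -Identity shabarinath `
    -WindowsEmailAddress "shabarinath@activedirectory.in"
```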

Teams External Users

Guest users can be invited at this stage. This should be done manually, as the PowerShell cmdlet does not support adding external users to a team or team channel. At the time of writing this post, the only two options available for the -Role switch are Member and Owner.

Export Calendar/Contacts/Tasks

If calendars/contacts/tasks were exported as a final sync, use the import option in Outlook to import the same PST.

Rearranging Mailbox Folders

If PST import was used, the imported folders will reside in a different folder structure. However, don't panic.

1. Moving subfolders within standard folders - Use Outlook in online mode. Move subfolders first; don't use copy. An example of a subfolder is Inbox\Customer1. This movement is easy and will be quick.

2. Items inside standard folders - Use Outlook in online mode. Move items in groups of 5K items per move. Again, don't copy-paste. This is because the Outlook move may sometimes fail, and if we opt for copy-paste, we need to be sure exactly which mails got copied; otherwise, mails will get duplicated. Standard folders are Inbox, Sent Items, Drafts, Deleted Items, Archive, Junk Email, etc., which are created by O365 at the time of provisioning.

3. Moving non-standard folders at the root - Use Outlook in online mode. Move one folder at a time, and the movement will be quick.

Once the movement is completed, we also need to clear off the folder structure created as part of the PST import. This should be done through Outlook Web Access: delete the root of the imported folders, then go to Deleted Items and clear it from there too.


Stamp the LegacyExchangeDN used at the source mailbox as an X500 address - export the LegacyExchangeDN from the source mailbox and stamp it as an X500 proxy address on the target mailbox.
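A sketch of the stamping step (the identity is hypothetical and the LegacyExchangeDN value is a placeholder for the value exported from the source tenant):

```powershell
# stamp the source mailbox's LegacyExchangeDN as an X500 proxy address
# on the target mailbox so replies to old internal mails keep resolving
$sourceLegacyDN = '<LegacyExchangeDN exported from the source mailbox>'
Set-Mailbox -Identity "shabarinath@activedirectory.in" `
    -EmailAddresses @{Add = "X500:$sourceLegacyDN"}
```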

With that, we are good to hand over user credentials. Make sure the users have FAQ documents on how to configure Outlook, mobile, etc. after the rollout. So prepare one in detail and share it with the end users in advance; ask them to keep a printout.

This is my last post on this series. Hope it was helpful.

Good luck !

Posts on Office365 cross tenant migration

GoDaddy Office365 to Microsoft Office365 Migration- Part 1

Migration Overview

Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 4

Migration Tools

Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 1

Initial Assessment

Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 5

Gathering Details

Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 3


Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 6


Click Here!
Posted by Shabarinath in Office365Migration

GoDaddy Office365 to Microsoft Office365 Migration- Part 13

Delete GoDaddy tenant on cut over day?

Before we get ready for the cutover, this is an important question to clarify. Once we have the data synchronized, the final step is to cut over the SMTP domain from the GoDaddy tenant to the Microsoft tenant. However, it's not as easy as we might think. GoDaddy has a different setup of Office365 which doesn't have the Office365 admin center; the admin center is replaced by the GoDaddy admin center, which has very limited options.

Challenge - The GoDaddy tenant needs an admin account, which is created using the GoDaddy admin center, and we don't have an option to create a user with the default onmicrosoft.com ID here. As a prerequisite for deleting a domain name from an Office365 tenant, it's mandatory that all recipients using that domain name drop it prior to the removal. This is like a deadlock: we need to change the SMTP domain, yet retain the admin access too.

If that's the case, we should have an additional SMTP domain to be used in the interim. All accounts need to have their SMTP addresses changed from activedirectory.in to interimdomainname.in so that we can retain the users as well as the data on the GoDaddy tenant even after the cutover.

If an interim domain is not an option, then the only option is to delete all the user objects from the GoDaddy admin center and call GoDaddy support, requesting them to drop the tenant.

Deleting Domain

Once the SMTP domain has been released from all recipients, we can call GoDaddy support or use PowerShell to remove the domain from the GoDaddy Office365 tenant.

Use Connect-MsolService to connect with Microsoft Online Service
Use Remove-MSOLDomain to delete the domain

We can also use Get-MsolUser -DomainName yourdomain.com to identify the objects which are currently using the domain name (even as a secondary SMTP address).
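The whole removal sequence, as a sketch against the GoDaddy tenant:

```powershell
Connect-MsolService

# find objects still holding the domain, even as a secondary SMTP address
Get-MsolUser -DomainName "yourdomain.com"

# once nothing references the domain any more, remove it
Remove-MsolDomain -DomainName "yourdomain.com"
```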

Once the domain deletion is successful, please allow 15 to 30 minutes before re-adding the domain on the Microsoft tenant.

For adding the domain, log in to the portal at https://admin.microsoft.com.

Navigate to Settings -> Domain

Click on + Add Domain

DNS Add1

The next step is to verify ownership. This can happen in different ways. With a few DNS service providers, the Microsoft O365 service can connect directly using the admin credentials of the DNS service provider and validate. The other option, used since the early days, is creating a DNS record, though that needs some time for the DNS records to become available to O365 services. Hence, the preferred option is to connect directly and validate.

I was using GoDaddy as my DNS provider for activedirectory.in. Hence, I got a prompt to connect with GoDaddy and verify the ownership.

DNS Add2

Once the domain ownership is successfully verified, click on Done.

DNS Add3

The next part is updating the DNS records. This page appears once we click Done on the previous page. Here we need to choose the services we are availing, and the corresponding DNS records will be displayed.

DNS Record Addition

Here too, the DNS connect option helps make the DNS update job easy. Just connect with the admin credentials, and all DNS records will get created in one shot.

Once the DNS records are created, validation will recheck that everything is good. This can also take some time, as the DNS records need to replicate. Once validation is successful, we are good to go ahead and start using this domain in the Microsoft tenant.

Posts on Office365 cross tenant migration

GoDaddy Office365 to Microsoft Office365 Migration- Part 1

Migration Overview

Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 4

Migration Tools

Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 7

Onboarding User Objects

Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 1

Initial Assessment

Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 5

Gathering Details

Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 8

Clone Teams and Teams channels

Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 3


Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 6


Click Here!

GoDaddy Office365 to Microsoft Office365 Migration- Part 9


Click Here!
Posted by Shabarinath in Office365Migration