One customer journey – One Solution – Thousands of connectors.

Most Connectors win!

Microsoft recently announced that they have surpassed 1,000 connectors. This might not mean much to many, but successful digital transformation projects are oftentimes riddled with non-working integrations, especially during transitional architectures. Access to vendor-supported cross-application connectors is oftentimes not only a long-term selling point but also a requirement for meeting time-to-market demands. No one today wants to invest in bespoke integrations unless it is necessary.

Dynamics 365 and the rest of the Microsoft application stack connect today to virtually any known application, either through an application-specific connector or via a more generic one.

What is a connector?

There are many different types of connectors, but within the Microsoft ecosystem we commonly refer to the Azure or Power Platform connectors.

The integration of Microsoft Dynamics 365 with connectors provides businesses with a powerful platform to automate their business processes, streamline their workflows, and enhance their overall productivity. The combination of Dynamics 365 and connectors allows businesses to seamlessly integrate their cloud-based and on-premises applications and enables data exchange between these systems in real time.

What are Azure Connectors?

Azure Connectors are a set of pre-built, managed, and configurable services that provide a fast and secure way to integrate applications and services. These connectors provide an easy-to-use interface for exchanging data between systems, and are designed to work seamlessly with Azure services and other cloud-based and on-premises applications. With Azure Connectors, businesses can eliminate the need for custom integration code and reduce the time required to integrate systems, which ultimately leads to increased efficiency and cost savings.

How do Azure Connectors work with Dynamics 365?

Azure Connectors provide a bridge between Dynamics 365 and other applications and services, enabling the exchange of data in real-time. The connectors can be used to integrate Dynamics 365 with a variety of applications, including SharePoint, OneDrive, Excel, and many more. With the use of Azure connectors, businesses can easily connect Dynamics 365 to their existing applications and services, allowing them to automate workflows, streamline processes, and enhance the overall productivity of their organization.

What are the benefits of using Azure Connectors with Dynamics 365?

  1. Increased Efficiency: Integrating Dynamics 365 with Azure Connectors eliminates the need for manual data entry and reduces the time required to complete tasks. This increased efficiency enables businesses to focus on more important business activities, leading to enhanced productivity and cost savings.
  2. Improved Data Quality: Azure Connectors provide a secure and reliable way to exchange data between systems, ensuring that data is accurate and up-to-date. This improved data quality provides businesses with a single source of truth, which can be used to make informed business decisions.
  3. Enhanced Collaboration: Integrating Dynamics 365 with Azure Connectors enables businesses to collaborate more effectively by providing access to real-time data from multiple systems. This enhanced collaboration leads to improved teamwork and better decision-making.
  4. Customizable Workflows: Azure Connectors allow businesses to automate workflows, streamline processes, and enhance the overall efficiency of their organization. The connectors can be configured to meet the specific needs of each business, providing a tailored solution optimized for their unique requirements.
  5. Enterprise Support: Whenever you need a helping hand, Microsoft offers 24/7 support for all connectors.
  6. Monitoring: Built-in monitoring is a click away, at no extra cost. Get alerts and track usage trends for your connectors.
  7. Robustness: Retry and throttling behavior is built into the connectors out of the box.


In conclusion, integrating Microsoft Dynamics 365 with Azure Connectors gives businesses a powerful platform to automate business processes, streamline workflows, and enhance overall productivity. Businesses can easily connect Dynamics 365 to their existing applications and services, and the benefits include increased efficiency, improved data quality, enhanced collaboration, and customizable workflows – all for a fraction of the cost of bespoke integration, and with full support from Microsoft.

Become an expert using these connectors, and you will be a valuable resource going forward in any size digital transformation project!

A morning like any other morning!

Waking up at 6.17 AM was no accident. My personAId Xris (pronounced Chris) woke me at the end of my REM cycle, knowing that I would still have enough time for my obligations this regular Tuesday.

I frown briefly, thinking of the nanosec pricing for the publicly accessible quantum computing services. Gathering daily insights from my 17 petabytes of demeanor data (all my past data interactions) costs about 3.23 nanosecs. Xris requires this ongoing data analytics to have the insights needed to act as my assistant and business partner.

Tuesday has physical exercise first on the agenda. I put my AR glasses on and start running down the familiar island trail. Conversations from ongoing projects are hovering transparently and steadily in front of my field of vision. I whisper commands (voice commands only audible to Xris) to give me the necessary updates.

I get a call while listening to Xris highlighting the sentiments and essential excerpts from today’s conversations. The priority call from my manager is put straight through without being greeted by my conversational bot. I have been running for 20 minutes, so Xris modulates my voice to disguise my shortness of breath.

I’m asked to join an urgent meeting to discuss some last-minute project changes. I realize five minutes into the meeting that I am merely there as a listener and have no input to share, so I switch to bot-mode despite company policy. The steady increase in non-physical meeting participation using bots as stand-ins has encouraged many enterprises to enforce meeting ethics disallowing unannounced bot-modes.

In bot-mode, Xris can track the conversation and sentiment to foresee any required participation on my part. I would get an early warning and a recap before rejoining the meeting. Instead, I get into the shower, listening to the latest episode of my favorite podcast.

This might all sound like sci-fi, but all the technology envisioned in the previous paragraphs is, in fact, very much a reality – just not yet part of mainstream applications.

My ambition was to get you thinking about the art of the possible. The only way these will ever become sought-after applications is by implementing them in our next project – and it all starts in your mind.

You don’t need to be a large enterprise to utilize the latest conversational intelligence IVR solutions from #Nuance with your #Dynamics365 Customer Service application. Nor do you need an organization with a hundred-plus customer service agents to reach break-even on virtual assistants.

We might not get Xris today, but we do get an experience that lets organizations reinvest in development the money saved through the reduced handling costs of the self-service that virtual assistants provide.

For all the geeks out there like me, check out Microsoft’s development framework for building virtual assistants in your environments. Why not make your next customer experience project, whether internal or customer-facing, include a virtual assistant in a Teams channel answering all those recurring queries?

Hoping to see many examples and ideas at the upcoming #MicrosoftBuild.

Don’t forget to register!


D365 CE + FO + Large transactional volumes

When integrating the Dynamics 365 CE (Customer Engagement) platform, which is based on Microsoft Dataverse, with D365 FO (Finance and Operations), we have a few options.

The suggested Microsoft method is to use Dual-write if we require synchronous replication of data between those two applications. Dual-write has a lot of out-of-box capabilities to accomplish a functioning link between D365 CE and FO – but in my experience, this only works well where the use case does not involve a large number of transactions over a short time frame.

The integration templates include many different tables, and there are many standard integrations out of the box for both D365 Sales and Field Service. But as soon as you wish to adjust and add to them, you will immediately hit various challenges.

You will learn that Dual-write is still a very immature product and still has bugs that you will need to understand and create workarounds for. You will realize that the GUI has limitations, and you’ll learn how to deal with those. You will discover that many of the D365 FO entities do not support Dual-write because they cannot be accessed via the OData web API.

But when you finally overcome all these teething issues that I believe Dual-write is struggling with, you then need to appreciate the do’s and don’ts that Microsoft has documented.

Based on our previous enterprise definition, we would not see many use cases fitting that list.

In particular, you would most often have multiple instances of D365 CE.

But even if you do fulfill the above list you’d soon hit a wall in terms of large volumes.

Not because Dual-write has issues dealing with large volumes, but because I would always try to avoid synchronous solutions for large volumes. I’d always try to design large transactional flows using a decoupled model. There are multiple scenarios where data can fail to be created, read, updated, or deleted (CRUD). We need to take this into account, have ways to deal with these failures, and allow users to make guided decisions to rectify the data and proceed.
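As a sketch of what I mean by a decoupled model – all names here are hypothetical, and a real implementation would use a service such as Azure Service Bus rather than an in-memory queue – the pattern is: pull transactions off a queue, attempt the CRUD operation, and park failures in a dead-letter store where a user can later make a guided decision to rectify the data and retry:

```python
import queue

def process_transactions(inbound: "queue.Queue", upsert, dead_letters: list) -> int:
    """Drain the inbound queue; route records that fail their CRUD
    operation to a dead-letter list for user-guided rectification."""
    processed = 0
    while not inbound.empty():
        record = inbound.get()
        try:
            upsert(record)            # e.g. a Dataverse create/update call
            processed += 1
        except Exception as exc:      # failed CRUD: keep record and reason
            dead_letters.append({"record": record, "reason": str(exc)})
    return processed

# Hypothetical target system: rejects records missing the lookup value.
def fake_upsert(record):
    if "accountid" not in record:
        raise ValueError("lookup 'accountid' missing")

inbound = queue.Queue()
for rec in [{"accountid": 1, "amount": 10}, {"amount": 20}]:
    inbound.put(rec)

dead = []
ok = process_transactions(inbound, fake_upsert, dead)
```

The point of the pattern is that a failing record never blocks the flow or silently disappears; it waits in the dead-letter store with its failure reason until someone fixes it.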

Analyzing and rectifying issues with data sync using Dual-write is truly a technical, administrative task.

In addition, we need to consider the API service protection limits of Dynamics 365 CE. Dual-write does not account for other sources or logic changing the same data, so it is a pain to keep centralized control of all the CRUD operations.

So, still as of today (29 April 2022), I would use technologies other than Dual-write in any given enterprise scenario.

Dynamics 365 CE 99.9% Enterprise Support

As the title suggests, when dealing with D365 CE deployment projects, we expect the platform to cover close to 99.9% of organizational requirements.

No, I am not talking about the SLA from December 2021.

I’m trying to quantify the features for admins, developers, and users that we would expect in enterprise scenarios. Maybe not exactly that number – and to be fair, I haven’t done the calculations. But my point is that there are a few important things to consider when running an enterprise-scale D365 CE project. Let’s start from the top.

What do we consider to be enterprise? As an enterprise architect, I would say it is the governing realm of all business processes, people, solutions, and data we require to deliver a specified set of services or products to a given market. In essence, this is either a company, a group of companies, or a business unit within a company.

But for this article, the enterprise also emphasizes large volumes of transactions, traceability requirements, risk mitigation, scalability, application lifecycle support, etc. The list goes on, but all refer to the variables that occur within companies operating at extensive scale, combining many people across many systems and solutions.

For this article, I have focused on development and on large volumes of transactions being sent to and from D365 Customer Engagement.

First, the requirement of dealing with large volumes of transactions.

It is not unusual for large organizations to deal with hundreds of thousands of transactions daily – transactions ranging from monetary, such as sales orders, to IT-administrative, such as audit logs.

Usually, you have a requirement to ensure 100% delivery of these transactions, regardless of whether you are receiving or sending. To accomplish this, you need to make sure transactions are always routed and created correctly. There are many ways of achieving just that, but in simple terms, we want to avoid failure scenarios causing data loss. For the D365 Customer Engagement platform – apart from apparent bugs in logic – these failure scenarios are most commonly caused by high-level data mismatches. For example, incorrect parameters in lookup fields can cause records to fail to be created.
Just as commonly, you also hit design limits in your technology. The most common technology for transacting data in and out of the Dynamics 365 Customer Engagement platform is the standard API provided by Microsoft. Its service protection limits are there to safeguard the Dynamics 365 CE platform from misuse that would ultimately render the entire service unusable.

The limits have changed over time, but currently one user is limited to 6,000 service calls to the API within a 5-minute sliding window. If we exceed this, the service responds with a 429 error.

Naturally, Microsoft offers many ways to deal with this, and most of them will eventually provide a solution. You can add more users, segment the calls into batches, throttle the rate of calls to the API, etc. But essentially, you need to be proactive! Microsoft will not do this for you – this is something you need to plan for when developing your solutions and logic.

So is this enough? Are we there? Sorry – we aren’t.

On top of all our efforts, we also need to understand that, as business consultants using the Microsoft platform, we are not alone in providing our service. Essentially, we are partnering with Microsoft to deliver it. Using cloud services means Microsoft employs most of your application and infrastructure specialists. We also need to appreciate that services in the cloud are still hardware somewhere, which needs to be correctly tuned to fit our requirements. The latter isn’t always plug-and-play!

I often see services such as Azure Logic Apps or the Dynamics 365 CE API stop responding: the logic app calls the API, and Dataverse responds with a 400.

In a recent project, we hit these issues 1-2 times out of 10,000 records. That is enough to cause grave problems for an enterprise and must be dealt with. Contacting Microsoft support, you’ll learn that the only fix is to adjust the resources in their backend – so nothing you can control in advance, unless you take your chances and lower the rate of transactions per minute. But by doing so, you would still neither know about nor control other areas of logic calling the same service. So my advice for dealing with these types of issues: always do stress testing, so you do not need to stress.
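A stress test does not have to be elaborate; even a small harness that hammers an endpoint concurrently and reports the failure rate will surface this class of backend problem before go-live. The endpoint below is a hypothetical stand-in that fails on 2 records out of 10,000, roughly the rate we saw in the project:

```python
from concurrent.futures import ThreadPoolExecutor

def stress_test(call, total=10_000, workers=8):
    """Fire `total` calls concurrently and report how many failed,
    plus the overall failure rate."""
    def one(i):
        try:
            call(i)
            return True
        except Exception:
            return False

    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(one, range(total)))
    failures = results.count(False)
    return failures, failures / total

# Hypothetical stand-in for the real endpoint: two sporadic failures.
def flaky_endpoint(i):
    if i in (1234, 8765):
        raise RuntimeError("HTTP 400 from Dataverse")

failures, rate = stress_test(flaky_endpoint)
```

Against a real environment, `call` would wrap the actual API request; the interesting output is whether the failure rate stays flat as you raise `workers` and `total`, or climbs once the backend resources saturate.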

Now on to the requirements gap in delivering functioning ALM with Dynamics 365 Customer Engagement.

I am a true believer in DevOps; I use Azure DevOps extensively, and sometimes it is the only place for all my project activities and documentation. In my previous blog article I wrote about manual intervention – it is what I use in my releases via ADO pipelines to complete the set of activities where D365 CE falls short of providing programmatic ways of changing or applying logic to the environment. Below I would like to address those activities that I wish had better ways to be manipulated using code or scripts.

Fiscal year – a required setting for most companies, but not possible to set via either the API or PowerShell. You have to go somewhere and press a few buttons.

App feature settings such as Export to PDF – you have to log on to the app and choose the entities used for PDF exports.

Turning off preview features, such as the enhanced product experience, must be done manually.

Almost all settings in the Power Platform admin center, such as turning on Dataverse search or configuring audit log settings.

Registering webhooks via the Plug-in Registration Tool must be done manually.

So unfortunately, as of today, a few things still hinder us from delivering a complete solution offering a zero-touch ALM process for Dynamics 365 Customer Engagement.

In DevOps we trust

In DevOps we trust – to not become fools in love!

Recently I was challenged as a solution architect to deliver a fairly large project with a fixed, hard go-live date. The delivery was far from out of the box and required thousands of development hours. With that said, I knew there was very little margin for error in the various stage-release processes. So I decided to yet again revisit the possibility of delivering a full-fledged CI/CD configuration via Azure DevOps for Dynamics 365 Sales and Azure services. The spoiler is that I might have a crush, but I am not yet in love!

Now to why I still think we have some ground to cover before I can smile through the whole sentence while proclaiming the ALM possibilities of Dynamics 365 Sales.

First off, my ambition was to avoid manual steps in the deployment process. This is, in my view, not obligatory in CI/CD, but it is best practice. Adding manual steps to any process is similar to the famous “broken windows” theory, which holds that crime can start with overlooking the simplest degradation in society: first it is a broken window, which ultimately leads to a chain of events allowing the culprit to nestle its way into a controlled environment and do harm. The same goes for many processes, and ALM is no different. Which leads me to my first problem with the current ALM possibilities in Dynamics 365 Sales.

It is not possible to automate all parts of Dynamics 365 Sales programmatically!

But all is not lost, of course. There is a feature within Azure DevOps Pipelines called manual intervention. This is by no means a new feature, nor is it unique to Azure DevOps.

Please refer to Microsoft Docs here.

Basically, it pauses your current pipeline run and lets you continue the automation after you have manually deployed or made changes outside the pipeline. This is, in my view, ingenious and simple at the same time – and those usually go hand in hand.
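In YAML pipelines, this capability surfaces as the ManualValidation task, which must run in an agentless (server) job. A minimal sketch – the notification address and instruction text are placeholders:

```yaml
jobs:
- job: manual_steps
  pool: server                  # ManualValidation only runs in agentless jobs
  timeoutInMinutes: 1440        # give the operator up to a day
  steps:
  - task: ManualValidation@0
    inputs:
      notifyUsers: 'release.team@example.com'   # placeholder address
      instructions: 'Apply the settings that cannot be scripted, then resume.'
      onTimeout: 'resume'       # continue automatically if nobody responds
```

Later jobs can then depend on this one, so the automated part of the deployment continues only after the manual work is confirmed.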

This merging of manual and automated steps is something oftentimes missing in process-oriented solution systems.

I would love to see this type of behavioral input possibilities in other systems without requiring development and customization.

So to summarize – Dynamics 365 Sales + Azure DevOps + Manual Interventions = almost complete ALM 😊

Magnus Oxenwaldt

Digital transformation and enterprise architecture enthusiast


Magnus has a diversified background, having worked in various business areas and many different roles. He holds degrees in technology, economics, and marketing.

He currently works as an Executive Business Consultant with Columbus Global.

Originally from Sweden, he has worked globally; his current office location is Oslo, Norway.
