I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification, or “MB-600” for short! As I prepare I plan to write revision notes; in this post I will cover leading the design process.
A Dynamics 365 / Power Platform Solution Architect needs to lead successful implementations, meaning the architect is a key member of the project team. They must demonstrate functional and technical knowledge of Power Platform and Dynamics 365 apps, and beyond that they must appreciate other related technologies that form part of the overall architecture.
This section of the exam is pretty large! I therefore plan to split this information into three posts, this being the second of the three.
In this post I will try to tackle topics under the following headings:
- Identify opportunities for component reuse
- Communicate system design visually
- Design Application Lifecycle Management (ALM)
Identify opportunities for component reuse
App components allow reuse within and across apps. For example, an account form created for your sales app may work perfectly well in your marketing and customer service apps. Whilst you could create a separate form for each app, if it works reuse it!
Additionally, careful consideration of which components will be included in your apps can allow multiple people to work on building a single app.
Having common components can help promote consistency in the application and also reduce redundancy. Plus, component reuse results in a final solution that should be easier to maintain.
Custom components can be created using the Power Apps Component Framework (PCF) to provide reusable features. Look for visuals that would benefit from the investment of being made into a component (e.g. headers, common widgets etc.)!
Canvas app components are targeted at canvas app makers and can only be used in canvas apps. Professional developers can build components using Power Apps Component Framework (PCF). These components can be re-used across your model-driven apps and potentially also canvas apps.
As the Solution Architect designs classic workflows or Power Automate flows, considering when to use child flows may be required. Child flows allow us to break out parts of a flow into reusable child flows. If calling a child flow, note that you may need to ensure the “current environment” connector is used.
Communicate system design visually
The Architect will often create diagrams to help identify at a high level how the requirements will be implemented. This can act as a useful guide to help the implementation team create a detailed design. Using diagrams can therefore help communicate the solution design and data architecture in an easy-to-understand manner to both the customer and the wider development team.
Additionally the creation of an overall solution architecture diagram will help identify any opportunities to create proofs of concept / prototypes. When creating a POC you will typically not be creating the entire system, so having a visualization to see how it fits into the wider system may be very useful.
It will also be common to create diagrams to illustrate the data model, often called Entity Relationship Diagrams (ERDs). These show the entities and how they relate to the other entities in the same model.
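To make the idea concrete, here is a minimal sketch representing part of a data model in plain Python and printing the 1:N relationship lines an ERD would show. The entity names are hypothetical examples, not taken from any specific Dataverse schema.

```python
# A minimal sketch of a data model as plain Python data structures.
# Entity names here are hypothetical examples for illustration only.
relationships = [
    # (one side, many side) - i.e. 1:N relationships
    ("Account", "Contact"),
    ("Account", "Case"),
    ("Contact", "Case"),
]

def describe_erd(rels):
    """Return the 1:N relationship lines an ERD diagram would show."""
    return [f"{one} 1..N {many}" for one, many in rels]

for line in describe_erd(relationships):
    print(line)
```

In a real project a dedicated diagramming tool would be used; the point is simply that an ERD captures entities plus the cardinality of the relationships between them.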
Design Application Lifecycle Management (ALM)
Multiple people will be working on the project at the same time and often will be based in different locations. (For example, some of the developers could be based offshore.)
An agreed approach to deployments and testing environments will be required. Do you need separate environments for development, system testing, user acceptance testing, integration testing etc.? In a simple scenario the customer may have all their environments within a single tenant, but it is also common for multiple tenants to exist. Therefore deployments need to be from instance to instance, which may or may not be located in the same tenant.
Out of the box the Power Platform does not provide version-controlled tracking. ALM processes will be needed to ensure it is clear what components have been deployed to which instances and in what state.
When we talk about environments it may be useful to consider what is (and isn’t) in an environment. Our environment will contain all the Power Apps, Power Automate flows and Power Virtual Agents bots that make up our Power Platform solution. Additionally the environment will contain Dataverse (the Common Data Service) and any custom connectors. External to our environment (but maybe connecting to it) will be Power BI, any Azure services and non-customer engagement Dynamics 365 apps (such as Finance).
We may also need to consider the location of our environments / data. Microsoft publish details of their business applications and potential locations here.
There are multiple factors that might influence the choice of data location, including compliance / residency requirements, but technical constraints such as latency will also come into play. Additionally you may find certain applications or features are only available in certain locations. Commonly, selecting a location that is close to the majority of users will be preferable. You should also be aware that the location of the tenant (for billing) can be different to the location of the environment, and therefore the data location.
Often multiple environments will exist from an ALM point of view, with separate instances for dev, test, prod etc. But other reasons exist, such as when you need to isolate data that can’t be co-located, or when you have conflicting customizations that can’t co-exist. These sorts of scenarios can be common in large organisations with regional business models that differ in approach or maybe have differing compliance requirements.
As a Power Platform Solution Architect we will be expected to lead the establishment of an ALM plan. We may be called upon to evaluate / determine the level of sophistication in our ALM processes that is appropriate for the project. This will no doubt involve working with various teams to support their efforts to implement the selected ALM tools / processes. The Architect may need to consider:
- Environments – how many, and what is their purpose?
- Source code control – where will the master copy of the solutions and code reside?
- DevOps – what is the workflow for developers, and how (and by whom) will the app be promoted from dev to production?
- Deployment configuration – how to configure each environment, and what automations can be used to make this process easier?
Traditionally the Power Platform has been environment centric, meaning changes have been completed in a master environment and promoted from that environment into test, prod etc. An alternative approach is to be source control centric, meaning the source control system becomes the master. Dev can be re-created from source control, probably via an automated / repeatable process, and changes from dev are checked into source control. Microsoft is encouraging, and building tools to support, a source control centric ALM approach.
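As a sketch of what “source control as master” can look like in practice, the helper below builds Power Platform CLI (pac) command lines for exporting a solution and unpacking it into text files that can be committed. The exact flags shown (`--name`, `--path`, `--zipfile`, `--folder`) are assumptions based on the `pac solution` command set; verify them against the CLI help before relying on them.

```python
# Sketch: build pac CLI command lines for a source-control-centric flow.
# The pac flags below are assumed from the `pac solution` commands -
# verify against `pac solution help` before use.

def export_cmd(solution_name: str, zip_path: str) -> list[str]:
    """Export an unmanaged solution from the dev environment to a zip."""
    return ["pac", "solution", "export",
            "--name", solution_name, "--path", zip_path]

def unpack_cmd(zip_path: str, src_folder: str) -> list[str]:
    """Unpack the solution zip into text files suitable for git."""
    return ["pac", "solution", "unpack",
            "--zipfile", zip_path, "--folder", src_folder]

# The commands could then be executed with subprocess.run(cmd, check=True)
# and the unpacked folder committed to source control as the master copy.
print(" ".join(export_cmd("MySolution", "MySolution.zip")))
print(" ".join(unpack_cmd("MySolution.zip", "src/MySolution")))
```

The reverse direction (pack the folder back into a zip and import it) gives the repeatable process for re-creating a dev environment from source control.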
Solutions are the containers that are used to track changes made to Dataverse (the Common Data Service), Power Apps and Power Automate flows. Solutions are used to transport and install changes to target environments. The Microsoft Dynamics 365 apps are deployed into the Power Platform as solutions. Additionally, third-party apps created by independent software vendors (ISVs) are also delivered as solutions. Plus, internally you will create your own set of solutions as required.
Solutions may be unmanaged or managed. We typically use unmanaged solutions for development purposes and transport them to test / production environments as managed solutions.
As part of your revision I suggest you ensure you are familiar with the differences between managed and unmanaged solutions, and additionally how solution layering operates to provide the final “solution” visible to the end user. I describe solutions in greater detail in this post.
In any project we may decide to use multiple solutions. But it is advised that you only do this if there is a tangible purpose, as multiple solutions can bring complexity, not least because dependencies can be created between solutions. Ideally solutions should remain independent, and you may wish to create separate environments to ensure this is achieved. When using multiple solutions we could opt for a horizontal or vertical split of components. With a horizontal split we’d have solutions for groups of component types: say a solution for visual components, another for processes & plug-ins, another for security roles etc. With a vertical split we may group solutions into functional areas, maybe with one shared common / base solution and then separate solutions for each key business area.
When we consider which sub-components to add into a solution, it is considered good practice to only include the required components, avoiding adding all the sub-components of an entity (or all its metadata) unless you are making changes to all of them. Including only the changed components will help make deployment of solutions more manageable and will help reduce the chance of unnecessary dependencies.
Solution-aware code assets should be built within a build environment, and not on the developer’s desktop! Code assets will include plug-ins, form scripts and Power Apps Component Framework components. After the build they should be deployed to the master environment and will then be exported into the master solution.
The ALM plan must also consider how to manage any components that sit outside the Power Platform environment but may still be “environment aware”. These could include Power BI visualizations, Azure deployed components and external integration services.
We may also need to migrate a common set of configuration data from environment to environment. The Configuration Migration Tool can help move data between environments. Importantly it can maintain a common primary record identifier (GUID) for this data. One good example of this which I have commonly experienced is Unified Service Desk (USD). Our USD config is simply data used to describe how the user interface should behave. The USD config can be moved using the configuration migration tool.
Environment variables may also be used and tracked as solution components. Environment variables can be of types decimal number, JSON, text and two options. Each can have a default value and a current environment value. Apps, Power Automate flows and developer code can retrieve and modify the values of environment variables.
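As an illustration of the default / current value behaviour, the sketch below builds a Dataverse Web API query for an environment variable definition and resolves its effective value. The table names (`environmentvariabledefinitions`, `environmentvariablevalues`) are the Dataverse entity sets, but the navigation property, org URL and schema name used here are assumptions for illustration.

```python
# Sketch: resolving an environment variable via the Dataverse Web API.
# The org URL, schema name and navigation property name below are
# hypothetical - check the entity metadata in your own environment.

def definition_query(org_url: str, schema_name: str) -> str:
    """Build an OData query for a definition plus its current value."""
    return (f"{org_url}/api/data/v9.2/environmentvariabledefinitions"
            f"?$filter=schemaname eq '{schema_name}'"
            f"&$select=defaultvalue"
            f"&$expand=environmentvariabledefinition_environmentvariablevalue"
            f"($select=value)")

def resolve(definition: dict):
    """Current environment value wins; otherwise fall back to the default."""
    values = definition.get(
        "environmentvariabledefinition_environmentvariablevalue", [])
    if values:
        return values[0]["value"]
    return definition.get("defaultvalue")

# Example: a definition carrying both a default and a current value.
record = {"defaultvalue": "https://api.dev.example.com",
          "environmentvariabledefinition_environmentvariablevalue":
              [{"value": "https://api.prod.example.com"}]}
print(resolve(record))  # the current value overrides the default
```

This is the same resolution logic the platform applies: if no value record exists in the target environment, consumers see the default carried in the solution.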
Azure DevOps (formerly known as Visual Studio Team Services) provides development and collaboration tools, including Azure Boards, Pipelines, Repos, Test Plans and Artefacts.
- Azure Boards – plan, track, and discuss work across your teams
- Azure Pipelines – use to automate CI/CD builds and releases
  - Build pipelines are used to create dev environments, commit changes from dev to source control, check solutions and automate testing
  - Release pipelines are used to migrate solutions from build pipelines into test / prod
- Azure Repos – Source Control to store and track changes
- Azure Test Plans – Plan, execute, and track scripted tests
- Azure Artefacts – publish solutions built by build pipelines
Azure DevOps is not the only tool available to us! The CDS / admin APIs, or PowerShell directly, could be used instead for build tasks. Or Power Automate can be used with the platform admin connectors to automate deployment tasks.