MB6-898 Provide examples for uses of task management
MSDW Podcast: What's next in Microsoft Dynamics 365 Portal, with Nicholas Hayduk
The exciting, intriguing, and mundane updates promised for Portal in the April to September 2018 release wave.
Connecting Azure DevOps with Lifecycle Services for Release pipelines
I wanted to briefly add some considerations of my own while we wait for the official documentation.
This new feature is quick and easy to set up, and is something everyone should adopt sooner rather than later. It shaves off the time spent downloading the complete build artifact somewhere and then uploading it to the Dynamics Lifecycle Services project Asset Library. After a successful build of the full application, the package is automatically "released" and uploaded to the asset library.
We expect more "tasks" to be added, allowing us to set up a release pipeline that also automatically installs a successful build on a designated target environment. So getting this set up now, with the connection working, will set the stage for adding the next features as they are announced.
Here are some of the requirements and considerations while you set this up:
- You need to register an Azure Application of type Native (public, multi-tenant). While it is said you can use the preview experience in the Azure Portal to register the app, I had to use the "non-preview" experience to ensure I got a correct Native Azure app registration, and not a Web app. While you can add the necessary permission (user_impersonation) yourself, the admin consent must be run for the permission to take effect. If you are setting this up and you are not a Global Admin or Application Admin, you will need to get someone with the necessary permissions to run the admin consent part.
- The connection also requires user credentials as part of the setup. This should not be just any user, if you think about it: you don't want the connection to break just because the password was changed or the user was disabled. Also, multi-factor (or two-factor) authentication will not work here. So you might want to create a dedicated user for this connection. The user does not need any licenses; just a normal cloud user you have set up and logged on with at least once. The user also needs to be added to the Lifecycle Services project with at least Team member permissions (access to upload to the Asset Library). Log on to LCS with the user once and verify access.
- When you create the release pipeline for the first time, you will need to install the Azure DevOps task extension. Search for "Dynamics" and you should find "Dynamics 365 Unified Operations Tools". If you are doing the setup with a user that has access to many different Azure DevOps organizations (i.e. you're a partner developer doing this across multiple customers), make sure you install it on the correct Azure DevOps organization. When it is installed, you will have to refresh the task selection to see the newly available task(s), including the new task named "Dynamics Lifecycle Services (LCS) Asset Upload".
- When configuring the task, you will want to use variables to replace parts of the strings, like the name of the asset, the description, and so on; see the example just below. When you run a release, one of the steps actually lists out all the available variables, though with a slightly different syntax. Have a look at the list of available variables in this article, plus the tip on how to see the values they are populated with during a run.
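For example, assuming the task exposes name and description fields (the field labels here are approximate), the standard Azure DevOps predefined variables can be combined like this:

```
Asset name:        $(Build.DefinitionName) - $(Build.BuildNumber)
Asset description: Created by Azure DevOps release $(Release.ReleaseName)
```

$(Build.DefinitionName), $(Build.BuildNumber) and $(Release.ReleaseName) are standard predefined variables; the full list shown during a release run uses a slightly different syntax, as noted above.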
Here are a couple of errors you might encounter, and what they mean:
- AADSTS65001: The user or administrator has not consented to use the application with ID '***' named 'NAME_OF_APP'. Send an interactive authorization request for this user and resource.
Here you have not successfully run the admin consent. Someone with the proper permissions needs to run "Grant permissions" in the Azure Portal.
- AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access.
This is most likely because the user credentials used for the connection are secured with multi-factor authentication. Either use a different account without MFA, or disable it; though most likely it is enabled on that account for a reason.
Adding Value to Microsoft Dynamics 365 with Document Generation
Automation has always been a popular concept. Once a staple of science fiction lore, it has now become part of modern daily life. One example is office automation, which began back in the day with basic document management. It wasn’t long ago that everyone used typewriters and filing cabinets to manage their office documents, until PCs emerged in the 1980s. Microsoft Word eventually became the norm, followed by landmark products like Microsoft Office in 1990 and Office 2003, and eventually more sophisticated present-day technology like Dynamics 365.
The Case for Document Generation
There’s no question that a CRM can help your company with sales, marketing, customer service, and more, but what about document generation? This is where Xpertdoc comes into play. When integrated with CRM, Xpertdoc fully automates the processes for the generation, management, storage, and delivery of business documents. Not only can you design document templates, but business users can visually model and deploy document workflows leveraging data from any available source (like Dynamics 365), addressing the most complex document scenarios while eliminating the need for technical knowledge or coding skills.
Generating documents in 4 easy steps
The very first step of the document generation process is connecting your data source, such as Microsoft Dynamics 365. Once that is done, you’ll need to create a data set. Not to worry, it’s not as technical as it sounds: a data set is just a collection of business data in a database. Xpertdoc Smart Flows for Dynamics 365 features an updated Data Set Builder with a modern user interface and easy navigation, i.e. no programming skills required.
Now that you’ve got your data set, it’s time to design your template using our Microsoft Word-based Template Designer (check out a previous blog post on best practices for template design). The ability to combine data sets from multiple sources into a single template allows you to tie up loose ends in one fell swoop. Say you decide to create customized templates for different regions, which your sales force can access via a dashboard. With your customized template, you can auto-fill information for your repeat customers, which saves time and creates a more efficient customer experience (CX). Once your template’s in place, you’re all set to run a document flow.
There are more wide-reaching benefits of a document generation system, such as content search capabilities, e-signature support and more. You can archive templates, creating a personal library of enterprise information. The Single Sign-on authentication feature of Xpertdoc Smart Flows eliminates the need to remember passwords, providing a fast and easy connection. And speaking of connecting, you can enjoy flexible deployment options, either onsite or in a public or private cloud.
A modern document generation tool like Xpertdoc provides more automated capabilities, and a greater competitive business advantage, than old-fashioned document management.
To learn more about Xpertdoc Smart Flows for Microsoft Dynamics 365, request a free 30-day trial and follow us on Twitter: @xpertdoc or visit www.xpertdoc.com for more information.
The magic of reflow, or the story of two grids
How-to: Decompile your .app file
The other day we found out that some part of one of our extensions was not uploaded to our source code repository and no local version was available anymore. But we did have the .app file. I knew from extensions version 1.0 that the extension file, the .navx, was a zipped set of files; the x in .navx points at this, as with all Office file extensions nowadays. You might recall one of my old posts. As per that blog post, but in reverse order, would I be able to unzip the .app file by changing the file extension to .zip, and subsequently use the Windows Extract All feature? Apparently not:
So, my lazy mind thought: it can't be done this way, and I asked on one of the fora I am on whether there is another way of decompiling a .app file. Of course, on the Extension Management window there is the Download Source feature:
Indeed this gives you a nicely zipped file set:
But what if I really only have the .app file and nothing more? Bruno Leisibach and Peter Sørensen helped me out. Thanks, guys.
The .app file is indeed a zipped set of files. While my simple rename trick did not work, a program like 7-Zip will happily extract the files. Note however that the format of the files and the project structure is not exactly the same as when using Download Source. The app.json is now a NavxManifest.xml (where do we know this from?).
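If you prefer to script it, here is a minimal sketch in C# built on the same idea. It assumes, as 7-Zip's tolerance suggests, that the .app file is a standard zip archive preceded by a small proprietary header (which is why the plain rename to .zip fails), so it scans for the zip signature and extracts from that offset onwards:

```csharp
// Minimal sketch: extract the zipped file set from a .app file.
// Assumption: the .app file is a standard zip archive preceded by a
// small proprietary header. We scan for the zip local-file-header
// signature "PK\x03\x04" and extract from that offset onwards.
using System;
using System.IO;
using System.IO.Compression;

class AppUnpacker
{
    static void Main(string[] args)
    {
        string appFile = args[0];   // e.g. MyExtension.app
        string targetDir = args[1]; // output folder

        byte[] bytes = File.ReadAllBytes(appFile);
        byte[] sig = { 0x50, 0x4B, 0x03, 0x04 }; // "PK\x03\x04"

        // Find the first occurrence of the zip signature.
        int offset = -1;
        for (int i = 0; i <= bytes.Length - sig.Length; i++)
        {
            if (bytes[i] == sig[0] && bytes[i + 1] == sig[1] &&
                bytes[i + 2] == sig[2] && bytes[i + 3] == sig[3])
            {
                offset = i;
                break;
            }
        }
        if (offset < 0)
            throw new InvalidDataException("No zip content found in " + appFile);

        // Read the embedded zip from the found offset and extract it.
        using (var ms = new MemoryStream(bytes, offset, bytes.Length - offset))
        using (var zip = new ZipArchive(ms, ZipArchiveMode.Read))
        {
            zip.ExtractToDirectory(targetDir);
            Console.WriteLine($"Extracted {zip.Entries.Count} files to {targetDir}");
        }
    }
}
```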
Note
Download Source is implemented somewhat differently on NAV 2018 and Business Central:
- On NAV 2018 you can download the source if the extension has been installed on a specific tenant
- On Business Central it will only work when showMyCode is set to true in your extension and the extension has been built with target set to Internal; see the app.json fragment below.
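For reference, both settings live in the extension's app.json. A minimal, partial fragment (the name, publisher and version values are of course placeholders) could look like:

```json
{
  "name": "MyExtension",
  "publisher": "MyCompany",
  "version": "1.0.0.0",
  "showMyCode": true,
  "target": "Internal"
}
```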
Remove Microsoft Account From A PC
The steps to remove/unlink a Microsoft Account, including the associated Microsoft Store account, from a PC are actually quite simple, but I had to do a search to find them. Writing them down will help me to remember in future.
To unlink a Microsoft Account, open the Manage your account window from the Windows Start menu:

Click on the Email & app accounts tab in the navigation pane (these first two steps can be simplified by typing ms-settings:emailandaccounts in a Run prompt).
Click the Microsoft Account to remove and, when this section opens, click the Remove button:
When the popup confirmation box appears, click Yes to proceed:

Are you sure you want to remove this account? This will remove email and all content associated with it. Your organization might also remove some data stored on this device.
There are no second chances; once you’ve clicked Yes on the previous dialog, the account will be removed:
You can add the account back if required, or another Microsoft Account, by clicking the Add a Microsoft account link.
5 Significant Ways User Adoption Monitor Helps Augment Your Business within Dynamics 365 CRM
{Solved} Object reference error while connecting to Dynamics CRM Plugin Registration tool
Options in Business process flow
How to create a Business Process Flow (BPF).
Some Important Facts About BPF
What's a Relationship in BPF?
Action Step in BPF
Calling Workflow In BPF
Retrieve User assigned Roles with user information using Query Expression, C# in Dynamics 365.
Dynamics 365 for Marketing Deployment challenges
Dynamics 365 Saturday Scotland
DateTimes – it’s never the last word!
Way back in 2011 I blogged about the behaviour of DateTimes in Dynamics CRM (as it was then!). I titled the post 'the last word?' but of course, it's never the last word when it comes to a technology that is always moving forward.
This post aims to explain where we are today with Date & Times fields inside the Common Data Service for Applications (CDS) and PowerApps.
User Local vs. Time Zone Independent
In my previous post, I described the challenges of storing absolute dates such as dates of birth. These dates don't change depending on which time zone you are in. Since then, the Power Platform has gained support for 'Time Zone Independent' dates that will always show the date they were entered as.
If you choose DateTime as the field type you can then select from 3 'behaviours':
This table summarises the differences between these 3 behaviours:
| Field Type | Behaviour | Affected by User Time Zone in PowerApps? | Time Stored in CDS? | CDS WebApi Read/Write uses time zone? | Can be changed once set? |
|---|---|---|---|---|---|
| Date | User Local | ✅* | ✅ The time element is set to 00:00 minus the user's time zone offset. | ❌ Always UTC | ✅ Can change to Date Only or Time Zone Independent |
| Date | Date Only | ❌ | ❌ | ❌ | ❌ |
| Date | Time Zone Independent | ❌ | ✅ Always 00:00 irrespective of time zone | ❌ | ❌ |
| Date & Time | Time Zone Independent | ❌ | ✅ Time is set to whatever is entered by the user, with no adjustments. | ❌ | ❌ |
| Date & Time | User Local | ✅* | ✅ The time element is set to the time entered minus the user's time zone offset. | ❌ Always UTC | ✅ Can change to Time Zone Independent only |
*Model Driven Apps use the user's time zone settings. Canvas Apps use the local machine's time zone.
What's the difference between Date (Date Only) and Date (Time zone Independent)?
Given that Date fields should not show a time, why do we have both a Date Only and a Time Zone Independent behaviour for these types of fields? It's not clear why there is a distinction, but the effect is that the web service returns only the date element for Date (Date Only) fields, while for Date (Time Zone Independent) fields 00:00 is always returned irrespective of the time zone.
In a model-driven app the fields look like:
The WebApi returns 00:00:00Z for the Time zone independent field but not the Date Only field. The formatted values are however identical!
I can't think of any reason why this might be useful other than if there were some client behaviour that couldn't deal with date-only fields and always needed a time element.
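To illustrate the difference described above (the field names here are hypothetical), a WebApi response containing both behaviours for the same entered date would look something like:

```json
{
  "new_dateonly": "2019-01-20",
  "new_timezoneindependent": "2019-01-20T00:00:00Z"
}
```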
Date Time (User Local) Field Example:
Here is a worked example of the standard behaviour in Date Time User Local fields:
| | Calculation | Worked Example |
|---|---|---|
| Time Zone Offset User 1 | | UTC +10:00 (Brisbane) |
| Time Zone Offset User 2 | | UTC −10:00 (Hawaii) |
| Time Entered by User 1 | | 20-Jan 14:00 |
| Stored in CDS as UTC | Entered − Offset 1 | 20-Jan 04:00 (14:00 − 10:00 = 04:00) |
| Shown in App to User 2 | Entered − Offset 1 + Offset 2 | 19-Jan 18:00 (14:00 − 10:00 + (−10:00) = 18:00) |
Notice how user 2 sees the date as 19th Jan even though user 1 entered it as 20th Jan.
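If you want to reproduce this arithmetic in code, a minimal C# sketch using TimeZoneInfo (Windows time zone IDs assumed) gives the same result:

```csharp
// Reproducing the User Local worked example with TimeZoneInfo.
using System;

class UserLocalDateTimeExample
{
    static void Main()
    {
        var brisbane = TimeZoneInfo.FindSystemTimeZoneById("E. Australia Standard Time"); // UTC+10
        var hawaii   = TimeZoneInfo.FindSystemTimeZoneById("Hawaiian Standard Time");     // UTC-10

        // User 1 enters 20-Jan 14:00 in Brisbane.
        var entered = new DateTime(2019, 1, 20, 14, 0, 0, DateTimeKind.Unspecified);

        // Stored in CDS as UTC: 14:00 - 10:00 = 20-Jan 04:00.
        DateTime storedUtc = TimeZoneInfo.ConvertTimeToUtc(entered, brisbane);

        // Shown to User 2 in Hawaii: 04:00 + (-10:00) = 19-Jan 18:00.
        DateTime shownToUser2 = TimeZoneInfo.ConvertTimeFromUtc(storedUtc, hawaii);

        Console.WriteLine($"Stored (UTC):    {storedUtc:dd-MMM HH:mm}");    // 20-Jan 04:00
        Console.WriteLine($"Shown to User 2: {shownToUser2:dd-MMM HH:mm}"); // 19-Jan 18:00
    }
}
```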
Date Only (User Local) Field Example:
For Date only User Local fields, the behaviour is the same except the time is set to 00:00 when entering the date. Here is a worked example:
| | Calculation | Worked Example |
|---|---|---|
| Time Zone Offset User 1 | | UTC +10:00 (Brisbane) |
| Time Zone Offset User 2 | | UTC −10:00 (Hawaii) |
| Time Entered by User 1 | | 20-Jan-19 00:00 |
| Stored in CDS as UTC | Entered − Offset 1 | 19-Jan 14:00 (00:00 − 10:00 = 14:00) |
| Shown in App to User 2 | Entered − Offset 1 + Offset 2 | 19-Jan 04:00 (00:00 − 10:00 + (−10:00) = 04:00) |
Notice here that even though the field is set to Date only it is still affected by the local user's time zone and so the Date shows as the 19th for User 2.
All other field types
For Time zone independent and Date only fields the calculations are simple – the date time returned is the same as entered irrespective of time zone.
| | Calculation | Worked Example |
|---|---|---|
| Time Zone Offset User 1 | | UTC +10:00 (Brisbane) |
| Time Zone Offset User 2 | | UTC −10:00 (Hawaii) |
| Time Entered by User 1 | | 20-Jan-19 14:00 |
| Stored in CDS the same as entered | | 20-Jan-19 14:00 |
| Shown in App to User 2 | | 20-Jan-19 14:00 |
Model Driven Apps
The UI behaviour in Model Driven Apps is simple, as shown below (in the same order as the table above).
Canvas Apps
If you build a Canvas app that includes these fields it will look like:
Current issues with the CDS Connector for Canvas Apps:
- There is an issue with the Date Only User Local behaviour where it shows the time element.
- The formatting of the dates will not honour the formatting in the user's CDS user settings. You will need to handle formatting manually using the Canvas Apps field formatting:
- DateTimeZone.Local will use the user's local machine's time zone rather than their CDS user settings time zone, so currently you'll need to compensate for this manually, since it could lead to a different date/time being shown in the Model Driven App compared to the Canvas App if the two time zones differ.
These issues will be fixed in a future release of the CDS connector.
WebApi Date Times
When you query, create or update date time fields using the WebApi, remember to always set the value in UTC and compensate for any time zone offsets manually, since the WebApi will not use the user's time zone at all.
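As a minimal sketch of that compensation (the field name new_mydatefield is hypothetical), this is the kind of conversion you would do client-side before sending a create or update:

```csharp
// Minimal sketch: compensate for the user's time zone yourself and
// send UTC to the WebApi. "new_mydatefield" is a hypothetical field.
using System;

class WebApiDateTimeSketch
{
    static void Main()
    {
        // The wall-clock time the user intends, in their own time zone.
        var userZone = TimeZoneInfo.FindSystemTimeZoneById("E. Australia Standard Time");
        var localTime = new DateTime(2019, 1, 20, 14, 0, 0, DateTimeKind.Unspecified);

        // Manual compensation: convert to UTC before writing.
        DateTime utc = TimeZoneInfo.ConvertTimeToUtc(localTime, userZone);

        // The JSON body for the create/update then carries the UTC value.
        string iso = utc.ToString("yyyy-MM-dd'T'HH:mm:ss'Z'");
        Console.WriteLine("{ \"new_mydatefield\": \"" + iso + "\" }");
        // prints: { "new_mydatefield": "2019-01-20T04:00:00Z" }
    }
}
```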
Changing Behaviour
As you can see in the table above, if you have User Local fields you can choose to change them to Date Only or Time Zone Independent fields, which is a one-way process. This does not affect the current values in the database (which will be UTC). New values will be stored correctly, but you may find that existing values now show incorrectly, because they will be the UTC value originally stored in the database. To correct this, you will need to write a conversion program using the ConvertDateAndTimeBehaviorRequest message.
You can find a sample written in C# to change the behaviour here - https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/org-service/sample-convert-date-time-behavior
Important: There is a cautionary note here in that you must open and re-save any workflows, business rules, calculated fields and rollup fields after changing the behaviour of the field.
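Condensed from the linked sample, the core of such a conversion program looks roughly like this. The entity and field names ("account", "new_mydatefield") are hypothetical placeholders, and an already-authenticated IOrganizationService is assumed:

```csharp
// Sketch of converting stored values after changing a field's behaviour,
// condensed from the linked SDK sample. Entity/field names are placeholders.
using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

class ConvertBehaviourSketch
{
    static void ConvertStoredValues(IOrganizationService service)
    {
        var request = new ConvertDateAndTimeBehaviorRequest
        {
            // Which attributes on which entities to convert.
            Attributes = new EntityAttributeCollection
            {
                new KeyValuePair<string, StringCollection>(
                    "account", new StringCollection { "new_mydatefield" })
            },
            // Reinterpret the stored UTC values in a specific time zone.
            ConversionRule = "SpecificTimeZone", // or DateTimeBehaviorConversionRule.SpecificTimeZone.Value
            TimeZoneCode = 190, // time zone code, e.g. India Standard Time; use your own
            AutoConvert = false // conversion is done using ConversionRule
        };

        var response = (ConvertDateAndTimeBehaviorResponse)service.Execute(request);

        // The conversion runs as an asynchronous system job.
        Console.WriteLine($"Conversion job id: {response.JobId}");
    }
}
```

Remember, per the note above, to open and re-save dependent workflows, business rules, calculated fields and rollup fields once the conversion job completes.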
Read more
There is good documentation on the Common Data Service DateTime fields at https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/behavior-format-date-time-field.
Information about changing date time behaviour - https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/behavior-format-date-time-attribute#convert-behavior-of-existing-date-and-time-values-in-the-database