When can I finally renew my Microsoft certification? - I’m certainly not alone with this question and the associated uncertainty. Admittedly, a certain impatience resonates as well. After all, I would like to fit the renewal into my daily routine. But how?
How do I best prepare for a Microsoft certification? - this or a similar question is asked by everyone who wants to engage with Microsoft, Azure, Microsoft 365, Power Platform or Dynamics 365. In this article, I would like to look at the options that Microsoft offers us for preparation.
Hello there, I’m Martin, a software architect and developer from the Cologne/Bonn area. Right from the start of my professional career, I chose .NET and Microsoft technologies and tools and have incorporated them into my work ever since. With more than 15 years of experience in software architecture and development with .NET, my focus is on raising the quality and performance of development teams, the interaction of the software solution with its target environment, and the actual application down to the last byte.
In my position as Director Consulting Services @ CGI, I act as an enterprise architect and developer for cloud-native and .NET solutions. I am also a trainer for cloud and software architecture. Outside my professional life, I am involved in open source communities and currently maintain various NuGet packages with different focuses and functionalities.
A strong willingness to learn and develop is also part of my everyday life. This reached a new level in 2021, when I completed my IHK trainer qualification and became a Microsoft Certified Trainer. In addition, I qualified as a trainer for CGI’s Risk and Cost Driven Architecture program in 2022.
As developers, we’re often tasked with maintaining and modernizing legacy codebases that were written long before some of the best practices of today—such as nullability annotations—were available. While modern C# now supports nullable reference types, enabling us to avoid the dreaded NullReferenceException, introducing this feature to existing, large codebases can be a challenge.
In this article, I’ll share my step-by-step approach for introducing nullability into a legacy .NET and C# project. You’ll learn how to apply nullability in a controlled, incremental manner using project-level settings, scoped annotations, and file/method-level directives, all while maintaining the integrity of your legacy codebase. After all, modernizing your code doesn’t have to be an all-or-nothing endeavor—gradual change is key to a successful transition. Let’s get started!
In the ever-evolving world of .NET development, managing project configurations effectively is crucial for maintaining a clean and efficient build process. One of the less frequently discussed but highly useful properties is BuildingInsideVisualStudio. This property, when correctly utilized, can streamline your build process and ensure that your project is configured properly depending on the build environment. In this article, we’ll explore the BuildingInsideVisualStudio property with concrete examples and discuss best practices for using it effectively.
Embark on a journey through Microsoft’s redesigned certification exam UI. Discover streamlined navigation, enhanced accessibility, and personalized experiences, revolutionizing the exam-taking experience.
For over 12 years, NuGet package management has been part of the .NET ecosystem, with direct integrations into various IDEs, CLIs and build systems. But one feature took 12 years to appear and certainly needs some more maintenance before it is mature!
Whatever our role, be it developer, IT professional or architect, we try to avoid technical debt. If this is not possible from the outset, or if we decide to accept this technical debt for a limited period of time, we usually lack the tools to do so. This is where this article may help.
In software development, dependencies are inevitable - any project worth its salt relies on various libraries, frameworks, or packages. However, as I found in my own work, managing these dependencies can be an onerous task. Constant updates, new vulnerabilities, and endless manual approvals were draining my time and focus. What if, I thought, these processes could be automated? This thought led to the creation of dependamerge, a GitHub Action designed to free developers from the drudgery of manual dependency maintenance and let us get back to what we do best: building great software.
Mastering .NET Project Properties: The BuildingInsideVisualStudio Flag
In the ever-evolving world of .NET development, managing project configurations effectively is crucial for maintaining a clean and efficient build process. One of the less frequently discussed but highly useful properties is BuildingInsideVisualStudio. This property, when correctly utilized, can streamline your build process and ensure that your project is configured properly depending on the build environment. In this article, we’ll explore the BuildingInsideVisualStudio property with concrete examples and discuss best practices for using it effectively.
The BuildingInsideVisualStudio property is a conditional flag that can be used within your project files (.csproj) to apply certain settings or include/exclude packages and references based on whether the project is being built inside Visual Studio. This property is particularly useful when you need to differentiate between builds triggered from Visual Studio and those triggered from other environments such as command-line builds or CI/CD pipelines.
Let’s start with a practical example: adding a package reference only when the project is being built inside Visual Studio. This can be useful when you want to include certain tools or analyzers only in the development environment to keep the build lean for production.
Assuming you want to add a reference to SonarAnalyzer.CSharp, a popular static code analysis tool, but only when building the project within Visual Studio, you can use the BuildingInsideVisualStudio property to conditionally include this package reference in your .csproj file. Why would you want to do this when the analysis already runs in your CI/CD pipeline? The answer is simple: you want the same code analysis rules and hints in your local development environment as in your CI/CD pipeline. This way, you can fix issues early and avoid surprises when pushing your code to the repository and waiting for potentially long-running CI/CD pipelines.
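A minimal sketch of such a conditional reference in the .csproj file could look like this (the version number is illustrative; use the current release for your project):

```xml
<!-- Include the analyzer only for builds started from inside Visual Studio. -->
<ItemGroup Condition="'$(BuildingInsideVisualStudio)' == 'true'">
  <PackageReference Include="SonarAnalyzer.CSharp" Version="9.32.0">
    <!-- Analyzer assets only; nothing flows into the build output. -->
    <PrivateAssets>all</PrivateAssets>
    <IncludeAssets>runtime; build; native; contentfiles; analyzers</IncludeAssets>
  </PackageReference>
</ItemGroup>
```

Command-line and CI/CD builds leave BuildingInsideVisualStudio unset, so the condition evaluates to false and the package is simply skipped there.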
Master dependency management with automation: story behind dependamerge
In software development, dependencies are inevitable - any project worth its salt relies on various libraries, frameworks, or packages. However, as I found in my own work, managing these dependencies can be an onerous task. Constant updates, new vulnerabilities, and endless manual approvals were draining my time and focus. What if, I thought, these processes could be automated? This thought led to the creation of dependamerge, a GitHub Action designed to free developers from the drudgery of manual dependency maintenance and let us get back to what we do best: building great software.
Like many developers, I used to spend a lot of time managing dependencies. Dependabot would helpfully create pull requests for each new release, but I still had to check and merge each one. This quickly became an endless cycle. The hassle of checking every dependency update, even minor ones, pulled me away from critical tasks.
The reality is that as teams grow in size, dependency management becomes increasingly complex. For a while, I was stuck in a manual cycle, balancing the risk of out-of-date dependencies against the time commitment of updates. This tension was a big factor that inspired dependamerge.
My experience echoed the frustrations faced by many developers:
Unending maintenance: Keeping up with dependency updates is like an unrelenting treadmill. Without automation, it’s all too easy for obsolete packages to slip through the cracks.
Disrupted flow: Each pull request interrupts the flow, forcing us to context-switch and potentially delaying real progress.
Security pressure: At a time when vulnerabilities can bring down entire ecosystems, dependency maintenance is non-negotiable, but finding the time to do it can feel impossible.
Productivity drain: Manual dependency management is a time sink, diverting focus from the core work of building and improving software.
Technical debt: Neglected dependencies can accumulate into a significant technical debt, leading to more problems down the line.
Automating dependency management with dependamerge brings a range of significant benefits that streamline development and enhance code quality:
Time-Saving: By automating dependency updates, dependamerge saves developers from manually reviewing each pull request. This efficiency frees up hours each week, allowing teams to focus on feature development and innovation rather than getting bogged down by routine maintenance.
Enhanced Security: In today’s landscape, where vulnerabilities can have far-reaching impacts, timely updates are essential for maintaining a secure codebase. With dependamerge, critical updates can be applied promptly and consistently, helping to protect your projects from potential threats. Automation ensures that nothing slips through the cracks, even when time is limited.
Improved Code Quality and Stability: Automated dependency updates reduce the risk of errors that can occur when manually merging changes across environments. Consistent updates prevent compatibility issues that might arise from neglected dependencies, contributing to a more stable and reliable codebase.
Reduced Technical Debt: By keeping dependencies up-to-date, dependamerge helps prevent the buildup of technical debt that can slow down future development and create unexpected blockers. With fewer outdated dependencies, teams can avoid the last-minute scramble to upgrade critical packages or dependencies right before a major release.
Seamless Integration in CI/CD Workflows: dependamerge is designed to operate smoothly within a CI/CD pipeline, allowing dependency updates to be tested and validated alongside other code changes. This integration reduces interruptions to the workflow and ensures that updates don’t introduce issues at later stages in the development lifecycle.
By automating these repetitive tasks, dependamerge empowers developers to focus on what matters most: building and improving software. It’s a tool that boosts productivity, enhances security, and ultimately contributes to a more efficient and resilient development process.
Designed to take the reins of dependency updates, dependamerge works with Dependabot to make dependency management truly seamless. This GitHub action doesn’t just approve updates—it is adjustable to your project’s specific needs, ensuring that only the right updates are merged at the right time. Even better, dependamerge can be part of a fully automated CI/CD pipeline, ensuring that dependency updates are tested and validated alongside other code changes.
Highlights of dependamerge include:
Fully compatible with Dependabot: dependamerge works seamlessly with Dependabot, extending its capabilities and streamlining the update process. To do this, dependamerge communicates with Dependabot’s comment commands to manage the pull requests.
Automated merging: With the ability to define specific merge rules, updates are approved and merged without disrupting your day. All current and future Dependabot ecosystems are supported.
Customizable conditions: Tailor the automation to prioritize critical updates, such as security patches, while handling non-critical updates according to your project’s needs.
Human-Free Handling: Freeing developers from dependency maintenance not only saves time, but also prevents mental fatigue from routine tasks. dependamerge ensures that updates are handled consistently and reliably, without manual intervention.
To start with dependamerge, you can use the following example configuration. This GitHub action is highly customizable, allowing you to adjust various parameters to suit your project’s specific requirements.
name: DependaMerge

on:
  pull_request:

jobs:
  dependabot:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      - name: DependaMerge
        uses: dailydevops/dependamerge-action@v1
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          command: squash # Merge all commits into one (default)
The output parameters in dependamerge provide a valuable summary of each action’s status and results, allowing you to programmatically react based on outcomes. Two key outputs include:
state: Indicates the action’s status, including:
approved: Pull request was successfully approved.
merged: Pull request was merged.
skipped: Action skipped the pull request, halting further processing.
failed: Action couldn’t process the pull request due to errors.
rebased: Pull request was rebased because its branch was behind the target branch.
Benefit: By checking the state output, your workflow can respond to each action outcome. For example, you could add conditional notifications for failed or skipped updates to ensure immediate attention or skip further testing if the pull request was already merged.
message: Contains additional information on the processing state, including error and debug details.
Benefit: The message output parameter can be leveraged for logging purposes or sent in a notification, enabling better tracking and diagnostics without requiring manual review. It’s especially useful for troubleshooting and ensuring full transparency of the automation process.
These output parameters add an essential layer of feedback, enabling automated downstream workflows based on dependamerge outcomes. The increased control and visibility improve overall workflow reliability and responsiveness.
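As a sketch, a follow-up step could react to the state output like this (the step id dependamerge and the failure-reporting step are illustrative assumptions, not prescribed by the action):

```yaml
steps:
  - name: DependaMerge
    id: dependamerge # assumed id so later steps can read the outputs
    uses: dailydevops/dependamerge-action@v1
    with:
      token: ${{ secrets.GITHUB_TOKEN }}

  # React only when the action could not process the pull request.
  - name: Report failure
    if: steps.dependamerge.outputs.state == 'failed'
    run: echo "dependamerge failed: ${{ steps.dependamerge.outputs.message }}"
```

The same pattern works for the other states, for example skipping further test steps once the pull request has already been merged.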
dependamerge thrives on community input. Whether you’re a developer or a user, your feedback and contributions are invaluable. By sharing your experiences, suggesting improvements, or submitting code, you can help shape the future of dependamerge. Every contribution, no matter how small, makes a difference in creating a more efficient and effective dependency management solution for all.
Whether you’re working on a private project, an open-source initiative, or a company-driven application, dependamerge is designed to meet your needs. By automating dependency management, you can focus on building great software without the burden of manual updates. The flexibility and customization options in dependamerge ensure that you can tailor the automation to your project’s specific requirements, making it a valuable addition to any development workflow.
If you’re like me, frustrated by dependency management’s time-consuming nature, dependamerge is the solution you’ve been waiting for. Try it out, contribute, and help shape the future of dependency management automation. Together, we can build a more efficient, secure, and productive development process for all.
How to Prepare for Microsoft Certification
How do I best prepare for a Microsoft certification? - this or a similar question is asked by everyone who wants to deal with the topics Microsoft, Azure, Microsoft 365, Power Platform or Dynamics 365. In this article, I would like to go into the possibilities that Microsoft offers us for preparation.
Regardless of whether you’re new to the subject or already know it, preparing for a potential exam is often a challenge. But first, let’s clarify which exams and certifications are available in the first place.
Microsoft categorizes its entire Certification Portfolio by category and level. The following categories are currently provided by Microsoft:
Azure
Dynamics 365
Microsoft 365
Power Platform
Security, Compliance and Identity
These are complemented by the levels Fundamentals, Role-based and Specialty. This is presented very clearly in the overview (aka.ms/TrainCertPoster), which is regularly updated by Microsoft. In addition, Microsoft offers a second, much more detailed overview (aka.ms/TrainCertDeck).
Whatever our role, be it developer, IT professional or architect, we try to avoid technical debt. If this is not possible from the outset, or if we decide to accept this technical debt for a limited period of time, we usually lack the tools to do so. This is where this article may help.
Technical debt is a metaphor used to describe the costs and risks incurred as a result of decisions or omissions. It is important to note that this metaphor can be applied to all types of technical debt.
First, there is architectural debt, which is usually based on a decision made by an individual architect or group of architects. Then there is implementation debt, which is probably the most common in most projects, as it is also identified through source code analysis. And then there is the test and documentation debt, which is far too often neglected.
Whatever the type of technical debt, the common denominator is that it tends to cause problems in projects and later in operations. In July 2011, Philippe Kruchten described them as “invisible negative elements in the backlog”.
In most projects, it is individuals or a small group of individuals who are aware of individual Technical Debts. However, these projects usually have another thing in common: when these technical debts are addressed, they are postponed or even dismissed.
To avoid this, Technical Debts need to be tracked in the same way as requirements or defects. All you need is a person with administrative rights in Azure DevOps or comparable platforms.
In this case, the process templates AgileRCDA and ScrumRCDA were simply extended with an additional work item type, which will be used in future to record and visualise technical debt. Back in 2011, Kruchten already used the colour black in his colour scheme for technical debt.
For later prioritisation and sorting, it is advisable to add further fields to the work item type.
This creates the technical foundation based on the process templates, and within the project only the technical debt type work items need to be recorded.
The process template extension presented here for Azure DevOps (or comparable platforms) takes only a few minutes to implement and deploy, but it will have the desired effect by the next sprint meeting. The black work items of the “technical debt” type quickly resemble tombstones and provide the necessary visibility.
Don’t be surprised if the tombstones start to pile up after a few weeks: your colleagues and team members know about other technical debts that you probably haven’t noticed yet.
In software development, dependencies are inevitable - any project worth its salt relies on various libraries, frameworks, or packages. However, as I found in my own work, managing these dependencies can be an onerous task. Constant updates, new vulnerabilities, and endless manual approvals were draining my time and focus. What if, I thought, these processes could be automated? This thought led to the creation of dependamerge, a GitHub Action designed to free developers from the drudgery of manual dependency maintenance and let us get back to what we do best: building great software.
In the ever-evolving world of .NET development, managing project configurations effectively is crucial for maintaining a clean and efficient build process. One of the less frequently discussed but highly useful properties is BuildingInsideVisualStudio. This property, when correctly utilized, can streamline your build process and ensure that your project is configured properly depending on the build environment. In this article, we’ll explore the BuildingInsideVisualStudio property with concrete examples and discuss best practices for using it effectively.
Embark on a journey through Microsoft’s redesigned certification exam UI. Discover streamlined navigation, enhanced accessibility, and personalized experiences, revolutionizing the exam-taking experience.
Whatever our role, be it developer, IT professional or architect, we try to avoid technical debt. If this is not possible from the outset, or if we decide to accept a technical debt for a limited period of time, we usually lack the tools to manage it. This is where this article may help.
Gradually Introducing Nullability in Legacy Code: A Practical Guide for .NET and C#
As developers, we’re often tasked with maintaining and modernizing legacy codebases that were written long before some of the best practices of today—such as nullability annotations—were available. While modern C# now supports nullable reference types, enabling us to avoid the dreaded NullReferenceException, introducing this feature to existing, large codebases can be a challenge.
In this article, I’ll share my step-by-step approach for introducing nullability into a legacy .NET and C# project. You’ll learn how to apply nullability in a controlled, incremental manner using project-level settings, scoped annotations, and file/method-level directives, all while maintaining the integrity of your legacy codebase. After all, modernizing your code doesn’t have to be an all-or-nothing endeavor—gradual change is key to a successful transition. Let’s get started!
Nullability annotations in C# allow us to specify whether a reference type can be null or not. This feature brings more type safety and reliability to your code, reducing the chance of runtime errors caused by null values. But here’s the challenge: introducing nullability into an existing, possibly large codebase with no clear code style, where methods and properties might be riddled with potential nulls, can result in an overwhelming number of compiler warnings. In such cases, it’s easy to give up on nullability altogether, leaving your codebase vulnerable to null reference exceptions. But it doesn’t have to be that way.
To address this, you can take an incremental approach. Rather than trying to make your entire codebase null-safe in one go, you can introduce nullability step by step—starting with new code and gradually refactoring old code. This method minimizes disruption and lets your team handle the transition without being flooded with warnings.
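A minimal sketch of this staged rollout: enable only the annotation context at the project level first, so no warnings are raised yet, and then opt individual files into full warnings with a directive. The FindUser method below is purely illustrative:

```xml
<!-- .csproj: annotations only, no nullable warnings yet -->
<PropertyGroup>
  <Nullable>annotations</Nullable>
</PropertyGroup>
```

```csharp
// Legacy file: opt this one file into full nullable warnings ahead of the rest
#nullable enable

public static string? FindUser(string id) =>
    id.Length == 0 ? null : id;

#nullable restore
```

Once a file compiles cleanly under `#nullable enable`, the directive can stay; when the whole project is clean, the project-level setting can be raised to `enable`.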
In modern C#, a string is assumed not nullable by default, meaning it cannot contain a null value without a compiler warning. However, you can explicitly declare a string as nullable by using the string? syntax. Here’s an example:
```csharp
string nonNullableString = "Hello"; // Can't be null; the compiler warns on null assignment
string? nullableString = null;      // Can be null, no compiler warning
```
The nullable string? type indicates that the variable may contain a null value, while the non-nullable string enforces that null values are not allowed.
The beauty of nullable reference types is that they make your intent clear, and the C# compiler helps enforce it through warnings whenever there is a potential null dereference. There is a long list of possible nullable warnings; see the "Nullable reference types warnings" documentation.
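For instance, dereferencing a possibly-null value raises warning CS8602, which the compiler's flow analysis removes once a null check is in place. GetName here is a stand-in for any method declared as returning string?:

```csharp
string? name = GetName();           // some method declared as returning string?

Console.WriteLine(name.Length);     // CS8602: dereference of a possibly null reference

if (name is not null)
{
    Console.WriteLine(name.Length); // no warning: flow analysis knows name is non-null here
}
```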
For over 12 years, NuGet package management has been part of the .NET ecosystem, with direct integrations into various IDEs, CLIs and build systems. Yet one feature took those 12 years to appear, and it certainly needs some more maintenance until it is mature!
Regardless of the code version management strategy, mono-repository vs. poly-repository, there has always been a need to synchronize the individual projects in the versions of NuGet packages used. Reasons for this are compatibility and security, but also new functionalities or bug fixes.
Over the years, the requirements in this area have evolved to the point where the previous approaches increasingly reached their limits. Not only the uniform use of the same package version, but also the general use of a package across all related projects of a solution was picked up and developed further in this context. However, the main shortcoming was never solved: until now, manual intervention by a developer was always necessary to update the versions of the packages used, and the existing IDE and CLI integrations produced more errors than they fixed.
Now the request has been fulfilled: in April 2022, Central Package Management (“CPM”) was introduced and released along with NuGet version 6.2 and some complementary features.
To enable central package management, the MSBuild property ManagePackageVersionsCentrally is set to true in the Directory.Packages.props file.
For version listing and management, PackageVersion elements are required, each containing the package name and the version to be used. The next step is to remove the Version attribute from all PackageReference elements in the project files. This completes the migration, and the solution uses central package management from now on.
Setting the MSBuild property CentralPackageTransitivePinningEnabled to true tells NuGet to pin transitive dependencies to the centrally defined versions as well. This property can be set in both Directory.Build.props and the aforementioned Directory.Packages.props.
Another feature is GlobalPackageReference, which can be used to reference a package, such as a code analyzer, in every project of the solution / repository. This kind of package reference should also be declared in Directory.Packages.props.
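Putting the pieces together, a minimal Directory.Packages.props could look like this; the package names and versions are merely examples:

```xml
<Project>
  <PropertyGroup>
    <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
    <CentralPackageTransitivePinningEnabled>true</CentralPackageTransitivePinningEnabled>
  </PropertyGroup>
  <ItemGroup>
    <!-- Central version definitions for packages used anywhere in the solution -->
    <PackageVersion Include="Newtonsoft.Json" Version="13.0.3" />
    <!-- Referenced by every project in the repository, e.g. a code analyzer -->
    <GlobalPackageReference Include="StyleCop.Analyzers" Version="1.1.118" />
  </ItemGroup>
</Project>
```

The individual project files then reference packages without a version, e.g. `<PackageReference Include="Newtonsoft.Json" />`, and NuGet resolves the version centrally.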
All in all, this is a great enhancement to the NuGet ecosystem. However, there are currently some issues with the Visual Studio and .NET CLI integrations. Both are able to evaluate the package references and restore the packages. However, when updating packages through Visual Studio, the XML structure of the project file is modified incorrectly, so manual rework is required. And when the .NET CLI adds a package reference to a project, CPM is ignored, which again causes build errors.
However, this should not deter you, because existing integrations such as GitHub’s Dependabot provide excellent results.
Reimagining the Microsoft Certification Exam UI Experience
In my experience, navigating through certification exams can sometimes feel like traversing a labyrinthine maze, especially when it comes to user interface (UI) experiences. However, Microsoft’s recent efforts in reimagining the UI for their certification exams are poised to revolutionize this journey. Let’s delve into how these changes are shaping the landscape of certification exams and what it means for aspiring professionals.
Gone are the days of clunky interfaces and confusing navigation. Microsoft’s new approach places a strong emphasis on user-centric design principles, making the exam-taking experience more intuitive and seamless. By incorporating feedback from a diverse range of users, Microsoft has tailored the UI to cater to the specific needs and preferences of exam takers.
One of the most notable improvements is the streamlined navigation system. The revamped UI provides clear signposts and intuitive pathways, allowing candidates to focus on showcasing their skills rather than getting lost in a maze of menus. Whether you’re accessing the exam on a desktop or mobile device, the navigation remains consistent and user-friendly.
Accessibility is at the forefront of Microsoft’s UI redesign efforts. The new interface is designed to be inclusive, ensuring that individuals with diverse abilities can navigate the exam with ease. Features such as customizable font sizes, screen reader compatibility, and keyboard shortcuts empower all candidates to demonstrate their knowledge without encountering unnecessary barriers.
Engagement is key to effective learning and assessment. Microsoft has introduced interactive elements within the exam UI to enhance the candidate experience. From drag-and-drop exercises to interactive simulations, these features simulate real-world scenarios, allowing candidates to demonstrate their practical skills in a dynamic environment.
No two candidates are alike, and Microsoft recognizes the importance of personalization in the learning journey. The new UI allows candidates to customize their exam experience based on their preferences and areas of expertise. Whether it’s adjusting the interface layout or accessing tailored resources, candidates have the flexibility to shape their exam journey according to their individual needs.
The redesigned UI seamlessly integrates with Microsoft Learn and other learning resources, providing candidates with easy access to study materials and practice exams. This integration fosters a cohesive learning experience, allowing candidates to bridge the gap between theory and practice effectively.
In conclusion, Microsoft’s reimagined certification exam UI experience marks a significant leap forward in the realm of professional development. By embracing user-centric design, streamlined navigation, enhanced accessibility, interactive elements, personalized experiences, and seamless integration with learning resources, Microsoft has set a new standard for certification exams. Aspiring professionals can now embark on their certification journey with confidence, knowing that the UI is designed to empower them every step of the way.
For those interested in exploring the redesigned UI firsthand, Microsoft offers a demo of the exam experience at http://aka.ms/examdemo. Embrace the future of certification exams and unlock new opportunities for professional growth.
How do I best prepare for a Microsoft certification? - this or a similar question is asked by everyone who wants to deal with the topics Microsoft, Azure, Microsoft 365, Power Platform or Dynamics 365. In this article, I would like to go into the possibilities that Microsoft offers us for preparation.
When Can I Finally Renew My Microsoft Certification?
When can I finally renew my Microsoft certification? - I’m certainly not alone with this or similar questions and the associated uncertainty. Okay, a certain impatience certainly resonates as well. After all, I would also like to schedule it into my daily routine. But how?
The role-based and specialty Microsoft certifications in the areas of Azure, Dynamics 365, Microsoft 365, Power Platform, and Security, Compliance and Identity are valid for one year; the Fundamentals certifications are the exception and do not expire. Since February 2021, however, Microsoft has offered the possibility to renew these certifications for one year at a time, free of charge.
Six months (exactly 180 days) before a certification expires, you receive an email from Microsoft with all the information needed to renew the certificate free of charge. But those who, like me, prefer to plan such things ahead apparently have no choice but to wait for that mail.
As always, a closer look at the URL structure helps the curious. You will notice that existing URLs such as https://learn.microsoft.com/en-us/certifications/azure-solutions-architect/ only need to be extended by the path segment renew/. This addition takes us to a completely new page with extensive information about recertification, provided we are logged in with our Microsoft Learn account.
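The trick boils down to simple string concatenation, sketched here in shell; the certification slug is just an example:

```shell
# Build the renewal URL by appending the renew/ path segment
base_url="https://learn.microsoft.com/en-us/certifications/azure-solutions-architect/"
renew_url="${base_url}renew/"
echo "$renew_url"
```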
At first glance, we can see until when the certification is valid and how many days are left. Below that is the prompt:
If you have this certification and it will expire within six months, you are eligible to renew. Show that you have kept current with the latest Azure updates by passing the renewal assessment. You can also prepare to renew with the curated collection of learning modules.
Skills measured in renewal assessment:
Design a data storage solution for relational data
Design a data storage solution for non-relational data
Describe high availability and disaster recovery strategies
Design an Azure compute solution
Design an application architecture
Design network solutions
Design data integration
This is followed by to-the-hour information on when the renewal assessment becomes available, along with a list of learning paths and modules that can be taken during the preparation time. Due to ongoing technological development, modules are updated or added now and then, so it is worth taking a look regularly.
It’s not a life-changing lifehack, and 6 months (180 days) really is enough time to get around to it. But if you hold more than two or three certifications, a bit of scheduling certainly makes sense, so good luck with your next (re)certification.