[{"content":"Recently I was facing an issue where a reference in a nested Azure Resource Manager template was still being executed, despite the condition being false. I have created an ARM template that first sets a dynamic IP address on the Network Interface and later on converts it to a static IP address. It uses the initially assigned IP address and converts it to static.\nThe issue with a reference in a nested template with a condition In the example below, you see a nested template that converts the dynamic IP address to a static IP address. If the runTheNestedTemplate parameter is set to true, everything will work fine as the condition is met. If the runTheNestedTemplate parameter is set to false, it should work too, but unfortunately Azure will still try to validate the reference mentioned in the privateIPAddress property, resulting in a failing ARM template deployment if NIC01 does not exist.\n{ \u0026#34;apiVersion\u0026#34;: \u0026#34;2019-10-01\u0026#34;, \u0026#34;type\u0026#34;: \u0026#34;Microsoft.Resources/deployments\u0026#34;, \u0026#34;name\u0026#34;: \u0026#34;[concat(variables(\u0026#39;vmName\u0026#39;), \u0026#39;-UpdateNicToStaticIPAddress\u0026#39;)]\u0026#34;, \u0026#34;condition\u0026#34;: \u0026#34;[or(equals(parameters(\u0026#39;runTheNestedTemplate\u0026#39;), \u0026#39;true\u0026#39;))]\u0026#34;, \u0026#34;dependsOn\u0026#34;: [ \u0026#34;[resourceId(\u0026#39;Microsoft.Network/networkInterfaces/\u0026#39;, concat(variables(\u0026#39;vmName\u0026#39;), \u0026#39;-NIC01\u0026#39;))]\u0026#34; ], \u0026#34;properties\u0026#34;: { \u0026#34;mode\u0026#34;: \u0026#34;Incremental\u0026#34;, \u0026#34;parameters\u0026#34;: {}, \u0026#34;template\u0026#34;: { \u0026#34;$schema\u0026#34;: \u0026#34;https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#\u0026#34;, \u0026#34;contentVersion\u0026#34;: \u0026#34;1.0.0.0\u0026#34;, \u0026#34;parameters\u0026#34;: {}, \u0026#34;variables\u0026#34;: {}, 
\u0026#34;resources\u0026#34;: [ { \u0026#34;type\u0026#34;: \u0026#34;Microsoft.Network/networkInterfaces\u0026#34;, \u0026#34;name\u0026#34;: \u0026#34;[concat(variables(\u0026#39;vmName\u0026#39;), \u0026#39;-NIC01\u0026#39;)]\u0026#34;, \u0026#34;apiVersion\u0026#34;: \u0026#34;2018-03-01\u0026#34;, \u0026#34;location\u0026#34;: \u0026#34;[parameters(\u0026#39;region\u0026#39;)]\u0026#34;, \u0026#34;properties\u0026#34;: { \u0026#34;ipConfigurations\u0026#34;: [ { \u0026#34;name\u0026#34;: \u0026#34;ipconfig1\u0026#34;, \u0026#34;properties\u0026#34;: { \u0026#34;privateIPAllocationMethod\u0026#34;: \u0026#34;Static\u0026#34;, \u0026#34;privateIPAddress\u0026#34;: \u0026#34;[reference(concat(variables(\u0026#39;vmName\u0026#39;), \u0026#39;-NIC01\u0026#39;), \u0026#39;2020-03-01\u0026#39;).ipConfigurations[0].properties.privateIPAddress]\u0026#34;, \u0026#34;subnet\u0026#34;: { \u0026#34;id\u0026#34;: \u0026#34;[concat(variables(\u0026#39;vnetRef\u0026#39;),\u0026#39;/subnets/\u0026#39;, parameters(\u0026#39;vnetSubnetName\u0026#39;))]\u0026#34; } } } ] } } ] } } } Workaround As a workaround to resolve this issue, you have to wrap the reference in an if() function:\n{ \u0026#34;apiVersion\u0026#34;: \u0026#34;2019-10-01\u0026#34;, \u0026#34;type\u0026#34;: \u0026#34;Microsoft.Resources/deployments\u0026#34;, \u0026#34;name\u0026#34;: \u0026#34;[concat(variables(\u0026#39;vmName\u0026#39;), \u0026#39;-UpdateNicToStaticIPAddress\u0026#39;)]\u0026#34;, \u0026#34;condition\u0026#34;: \u0026#34;[or(equals(parameters(\u0026#39;runTheNestedTemplate\u0026#39;), \u0026#39;true\u0026#39;))]\u0026#34;, \u0026#34;dependsOn\u0026#34;: [ \u0026#34;[resourceId(\u0026#39;Microsoft.Network/networkInterfaces/\u0026#39;, concat(variables(\u0026#39;vmName\u0026#39;), \u0026#39;-NIC01\u0026#39;))]\u0026#34; ], \u0026#34;properties\u0026#34;: { \u0026#34;mode\u0026#34;: \u0026#34;Incremental\u0026#34;, \u0026#34;parameters\u0026#34;: {}, \u0026#34;template\u0026#34;: { 
\u0026#34;$schema\u0026#34;: \u0026#34;https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#\u0026#34;, \u0026#34;contentVersion\u0026#34;: \u0026#34;1.0.0.0\u0026#34;, \u0026#34;parameters\u0026#34;: {}, \u0026#34;variables\u0026#34;: {}, \u0026#34;resources\u0026#34;: [ { \u0026#34;type\u0026#34;: \u0026#34;Microsoft.Network/networkInterfaces\u0026#34;, \u0026#34;name\u0026#34;: \u0026#34;[concat(variables(\u0026#39;vmName\u0026#39;), \u0026#39;-NIC01\u0026#39;)]\u0026#34;, \u0026#34;apiVersion\u0026#34;: \u0026#34;2018-03-01\u0026#34;, \u0026#34;location\u0026#34;: \u0026#34;[parameters(\u0026#39;region\u0026#39;)]\u0026#34;, \u0026#34;properties\u0026#34;: { \u0026#34;ipConfigurations\u0026#34;: [ { \u0026#34;name\u0026#34;: \u0026#34;ipconfig1\u0026#34;, \u0026#34;properties\u0026#34;: { \u0026#34;privateIPAllocationMethod\u0026#34;: \u0026#34;Static\u0026#34;, \u0026#34;privateIPAddress\u0026#34;: \u0026#34;[if(equals(parameters(\u0026#39;runTheNestedTemplate\u0026#39;),\u0026#39;true\u0026#39;), reference(concat(variables(\u0026#39;vmName\u0026#39;), \u0026#39;-NIC01\u0026#39;), \u0026#39;2020-03-01\u0026#39;).ipConfigurations[0].properties.privateIPAddress, \u0026#39;Empty\u0026#39;)]\u0026#34;, \u0026#34;subnet\u0026#34;: { \u0026#34;id\u0026#34;: \u0026#34;[concat(variables(\u0026#39;vnetRef\u0026#39;),\u0026#39;/subnets/\u0026#39;, parameters(\u0026#39;vnetSubnetName\u0026#39;))]\u0026#34; } } } ] } } ] } } } I have reported this bug to Microsoft. In the meantime, feel free to use this workaround.\n","permalink":"https://devsecninja.com/2020/06/10/arm-template-nested-condition-with-reference-does-not-work/","summary":"\u003cp\u003eRecently I was facing an issue where a reference in a nested Azure Resource Manager template was still being executed, despite the condition being false. I have created an ARM template that first sets a dynamic IP address on the Network Interface and later on converts it to a static IP address. 
It uses the initially assigned IP address and converts it to static.\u003c/p\u003e\n\u003ch1 id=\"the-issue-with-a-reference-in-a-nested-template-with-a-condition\"\u003eThe issue with a reference in a nested template with a condition\u003c/h1\u003e\n\u003cp\u003eIn the example below, you see a nested template that converts the dynamic IP address to a static IP address. If the \u003ccode\u003erunTheNestedTemplate\u003c/code\u003e parameter is set to true, everything will work fine as the condition is met. If the \u003ccode\u003erunTheNestedTemplate\u003c/code\u003e parameter is set to false, it \u003cstrong\u003eshould\u003c/strong\u003e work too, but unfortunately Azure will still try to validate the reference mentioned in the \u003ccode\u003eprivateIPAddress\u003c/code\u003e property, resulting in a failing ARM template deployment if NIC01 does not exist.\u003c/p\u003e","title":"ARM Template - Nested condition with reference does not work"},{"content":"As I recently migrated my Home Assistant instance to my good old Intel NUC, I needed a new media center. I also promised myself to buy a new Game PC after completing my bachelor\u0026rsquo;s study program while working full-time. What if I could combine these two things, while the screens are in different rooms?\nI spent lots and lots of time reading reviews about hardware. Luckily I still remember how to build a machine from back in the day, so I ordered the following parts:\nProcessor: AMD Ryzen 5 3600 Boxed\nMotherboard: ASRock Fatal1ty B450 Gaming-ITX/ac\nGraphics Card: Gigabyte Aorus Radeon RX 5700 XT 8G\nCase: Lian Li TU150 Window Zilver\nCPU Cooler: be quiet! Dark Rock 4\nFans: 2x Noctua NF-F12 PWM chromax.black.swap\nMemory: Crucial Ballistix Sport LT BLS2K16G4D32AESB\nPower Supply: Corsair SF600\nSSD: Intel 660p 1TB\nGaming PC Hardware Specifications\nSome people might wonder why I picked the Gigabyte Aorus graphics card, because cheaper models are available. 
I picked this specific graphics card because of the 3 (!) HDMI ports. We can debate on whether you should prefer DisplayPort over HDMI, but there is one specific reason why I picked HDMI.\nHDMI over UTP HDMI over UTP allows you to extend your HDMI signal over a UTP cable. I use this to connect my monitor and the televisions in both the living room and the bedroom to my centralized PC. To get this to work, you need a high-quality UTP cable (CAT6 or higher) and a UTP-to-HDMI converter. DIGITUS sells some great models such as the DIGITUS DS-55203 and the DIGITUS DS-55101 in case you need more range. I have both of them.\nDIGITUS DS-55203\nTo control the game PC from both the living room and the bedroom, I\u0026rsquo;m using the Logitech K400 and a K400 Plus.\nLogitech K400 Plus\nIf there is one thing I like to hide, it\u0026rsquo;s cables. 😁 So I took this to a new level by placing my game desktop centrally in the house and connecting my main monitor with an HDMI cable through the wall.\nMy new monitor, but where is the new Gaming PC?\nThe Game PC on the other side of the wall\nI\u0026rsquo;ve been using this setup for a couple of weeks and I\u0026rsquo;m really happy with the reliability and ease of use. Although it would help if Windows showed the login screen and the notification area of the taskbar on all monitors.\nWhat do you think of the setup? Would you consider a centralized PC in your house? Let me know!\nStay safe you all!\n","permalink":"https://devsecninja.com/2020/03/21/my-new-centralized-gaming-pc-using-hdmi-over-utp/","summary":"\u003cp\u003e\u003cstrong\u003eAs I recently migrated my Home Assistant instance to my good old Intel NUC, I needed a new media center. I also promised myself to buy a new Game PC after completing my bachelor\u0026rsquo;s study program while working full-time. 
What if I could combine these two things, while the screens are in different rooms?\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003eI spent lots and lots of time reading reviews about hardware. Luckily I still remember how to build a machine from back in the day, so I ordered the following parts:\u003c/p\u003e","title":"My New Centralized Gaming PC using HDMI over UTP"},{"content":"The OpenVPN Access Server appliance in Azure allows you to quickly set up an Azure Virtual Machine that you can use for VPN purposes. I\u0026rsquo;ve been using an OpenVPN Access Server myself for a couple of months now and I\u0026rsquo;m happy with the performance and features.\nEven on a very small VM size like the Standard B1ms with 1 vCPU and 2 GiB of memory.\nAt the time of writing this blog post, the documentation of OpenVPN is a bit vague about updating your Azure OpenVPN Access Server, so I wanted to share with you what is needed to keep your OpenVPN Access Server up-to-date.\nPrerequisites Before we start updating the OpenVPN Access Server, it\u0026rsquo;s important that you have an active SSH connection with root privileges. You can either get these by running sudo -i or by prefixing every command below with sudo. Additionally, I would recommend creating a snapshot of the Azure VM disk to ensure that you can always go back to an earlier point in time if the update fails.\nWhat we know about the OpenVPN appliance The virtual appliance is based on Ubuntu. In my case, it\u0026rsquo;s based on Ubuntu 18.04 LTS. 
You can find this by running the command below.\nroot@server:~# cat /etc/*-release DISTRIB_ID=Ubuntu DISTRIB_RELEASE=18.04 DISTRIB_CODENAME=bionic DISTRIB_DESCRIPTION=\u0026#34;Ubuntu 18.04.4 LTS\u0026#34; NAME=\u0026#34;Ubuntu\u0026#34; VERSION=\u0026#34;18.04.4 LTS (Bionic Beaver)\u0026#34; Now that we know that the virtual appliance uses Ubuntu, we can check if the OpenVPN software is available on the sources list of apt:\nroot@server:~# cat /etc/apt/sources.list.d/openvpn-as-repo.list deb http://as-repository.openvpn.net/as/debian bionic main Great! So now we can check if the packages are actually visible in apt:\nroot@server:~# apt list | grep \u0026#39;openvpn\u0026#39; gadmin-openvpn-client/bionic 0.1.9-1 amd64 gadmin-openvpn-server/bionic 0.1.5-3.1build1 amd64 gadmin-openvpn-server-dbg/bionic 0.1.5-3.1build1 amd64 network-manager-openvpn/bionic 1.8.2-1 amd64 network-manager-openvpn-gnome/bionic 1.8.2-1 amd64 openvpn/bionic-updates 2.4.4-2ubuntu1.3 amd64 openvpn-as/bionic,now 2.7.5-932a08a3-Ubuntu18 amd64 [installed] openvpn-as-bundled-clients/bionic,now 3 all [installed,automatic] openvpn-auth-ldap/bionic 2.0.3-6.1ubuntu2 amd64 openvpn-auth-radius/bionic 2.1-6build1 amd64 openvpn-auth-radius-dbg/bionic 2.1-6build1 amd64 openvpn-systemd-resolved/bionic 1.2.7-1 amd64 Updating the OpenVPN Access Server on Azure Now that we know that the OpenVPN software is managed by apt, we can easily apply OpenVPN updates:\nroot@server:~# apt update \u0026amp;\u0026amp; apt upgrade Hit:1 http://azure.archive.ubuntu.com/ubuntu bionic InRelease Hit:2 http://azure.archive.ubuntu.com/ubuntu bionic-updates InRelease Hit:3 http://azure.archive.ubuntu.com/ubuntu bionic-backports InRelease Hit:4 http://as-repository.openvpn.net/as/debian bionic InRelease Get:5 http://security.ubuntu.com/ubuntu bionic-security InRelease [88.7 kB] Fetched 88.7 kB in 0s (181 kB/s) Reading package lists... Done Building dependency tree Reading state information... Done All packages are up to date. 
Reading package lists... Done Building dependency tree Reading state information... Done Calculating upgrade... Done 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. As you can see at Hit:4, apt contacts the OpenVPN update service to see if there is a new version available. Luckily I\u0026rsquo;m already fully up-to-date.\n","permalink":"https://devsecninja.com/2020/02/02/update-your-openvpn-access-server-on-azure/","summary":"\u003cp\u003eThe OpenVPN Access Server appliance in Azure allows you to quickly set up an Azure Virtual Machine that you can use for VPN purposes. I\u0026rsquo;ve been using an OpenVPN Access Server myself for a couple of months now and I\u0026rsquo;m happy with the performance and features.\u003c/p\u003e\n\u003cp\u003eEven on a very small VM size like the \u003cem\u003eStandard B1ms\u003c/em\u003e with 1 vCPU and 2 GiB of memory.\u003c/p\u003e\n\u003cp\u003eAt the time of writing this blog post, the documentation of OpenVPN is a bit vague about updating your Azure OpenVPN Access Server, so I wanted to share with you what is needed to keep your OpenVPN Access Server up-to-date.\u003c/p\u003e","title":"Update your OpenVPN Access Server on Azure"},{"content":"In this blog post, I wrote down my considerations \u0026amp; requirements around finding my next password manager. 
I\u0026rsquo;ve been a LastPass Family user for several years, but I am going to switch to a new password manager.\nIn 2015, LastPass was sold to LogMeIn.\nIt didn\u0026rsquo;t feel good, but it wasn\u0026rsquo;t a deal breaker for me.\nRecent security issues like this one or this one made me freak out a bit.\nLast but not least, the parent company of LastPass - LogMeIn - sold itself to private equity firms.\nThat was it - I needed a new password manager, and while I\u0026rsquo;m on this journey to find one, I\u0026rsquo;m sharing it with the rest of the world. I hope this blog post will help you in finding your ultimate password manager.\nRequirements LastPass offers a very rich feature set, so I wanted to make sure that I capture everything I want from a new password manager.\nFeature (LastPass / 1Password / Dashlane / Bitwarden)\nPassword Generator: X / X / X / X\nFamily package: X / X / - / X\nShare credentials with family: X / X / ~ / X\nMFA with YubiKey: X / X / X / X\nEmergency Access: X / - / X / -\nStore sites \u0026amp; notes: X / X / X / X\nEasy import from LastPass: N/A / X / X / X\nStats about how safe my passwords are: X / X / X / X\niOS App, integration with autofill and with Touch/Face ID: X / X / X / X\nMSFT Edge extension: X / X / X / X\nServers hosted in Europe: ~ / X / X / -\nAutomatic password rotation: X / - / X / -\nEasy-to-use password suggestions: X / X / X / X\nSupport easily accessible: X / X / X / X\nDetailed release notes are available: X / X / X / X\nRemove account without contacting support: X / X / X / X\nPrice per month in EUR: Premium: 2,67, Family: 3,56 / Premium: 2,65, Family: 4,75 / Premium: 3,33 / Premium: 0,74, Family: 0,90, Family Prem: 3,88\nLastPass I\u0026rsquo;ve been a LastPass Premium/Family user for 5 years now and while I\u0026rsquo;m happy with their services, I also feel that they could do more on innovation and improving the product. 
The Edge Chromium extension isn\u0026rsquo;t as stable as I would like, and it isn\u0026rsquo;t as polished as 1Password\u0026rsquo;s.\nYou might have seen that the table above lists a ~ with the \u0026lsquo;Servers hosted in Europe\u0026rsquo; requirement. LastPass offers the option to store your vault in Europe as mentioned on their support page. Account data will still be hosted in both the United States and Europe, but it isn\u0026rsquo;t kept in a separate tenant like 1Password offers. I will dive deeper into this in the next chapter.\n1Password Stats about how safe my passwords are The Watchtower feature gives you insight into the health of the items in the vault. It tells you when an account was registered at a compromised or insecure website. It will also tell you when you have expiring entities. Additionally, you can compare your passwords with the HaveIBeenPwned.com database to find out if you have any compromised accounts.\nServers hosted in Europe 1Password offers their services in the US region with 1Password.com, 1Password.ca for Canada and in the Europe region with 1Password.eu. Right from the start, you can register in a certain region, which is completely separated from the other regions. Registering an account in the European region is a bit more expensive. With the currency rate today (27th of December, 2019), it\u0026rsquo;s 29 euro cents more expensive to buy the Family plan with 1Password.eu: $ 4.99 vs € 4,75.\n1Password states that they are GDPR compliant, which is something you don\u0026rsquo;t see organizations mentioning often. 
Besides that, 1Password says that they are using the Amazon AWS EU Frankfurt region based in Germany.\nAdditional Pros Multiple Vaults\nWith LastPass, I used folders in the beginning, but I mostly use the search function.\nThis now results in a completely uncategorized Password Manager.\nBeing able to create separate vaults with 1Password makes way more sense: one for private, one shared and one for work.\nYou can even mark a vault as \u0026lsquo;Safe for Travel\u0026rsquo;, so it will keep the vault on the device during travel.\nOne thing I would like to see from 1Password is adding additional controls to these vaults. E.g. accessing the Private vault requires re-authentication with a second factor. I see a lot of potential in this feature.\nWhile it is possible to change the icons of the vault to easily identify a vault, I would have liked to see a possibility to select some standard icons.\nLots of options while creating a new entry\nI always liked the idea of using \u0026rsquo;templates\u0026rsquo; before you create an entity in LastPass. When you are adding server credentials to your vault, it will show you a form with fields for the server name, username and password. 1Password has even more templates that you can use:\nThere are also multiple fields (or labels) that you can configure when creating a new entity. With LastPass, I always ended up adding more details to the notes section, so I really like this approach. When compared to other password managers, 1Password does this really well:\nSaving Multi-Factor Authentication codes in 1Password\nThis is a feature I\u0026rsquo;m not sure whether I like or not. On the one hand, I think that it would encourage users to activate multi-factor authentication on websites, as 1Password will take care of filling in the one-time password. But on the other hand, Multi-Factor Authentication forces a user to keep two things separate: something you know and something you have, for example. 
If you put all your eggs in 1Password\u0026rsquo;s basket, a hacker only needs access to 1Password to compromise all your accounts.\nAn important thing to mention is that it doesn\u0026rsquo;t seem to be possible to export your Multi-Factor Authentication entries (or \u0026lsquo;seeds\u0026rsquo;). While you can easily export your passwords and notes to move to another password manager, this might result in a 1Password lock-in. If you often use Multi-Factor Authentication and you want to move to another password manager, you have to manually reconfigure tens or hundreds of seeds to your platform of choice. Something to think about\u0026hellip;\nThings to improve for 1Password Multi-Factor Authentication during registration\nMulti-Factor Authentication is such an important feature to prevent cyber criminals from stealing your secrets. I would have liked to see 1Password prompt me to configure MFA during the registration process, instead of doing it manually in the settings menu afterwards.\nThe bad thing is that it\u0026rsquo;s even hidden under the menu item called \u0026ldquo;My Profile\u0026rdquo;.\nAfter opening your Profile, you have to click on the \u0026ldquo;More Actions\u0026rdquo; menu to get it configured.\nOther password managers also don\u0026rsquo;t prompt you actively to enable MFA, but 1Password makes it hard to find by hiding it in a dropdown menu:\nHiding the Two-Factor Authentication Menu under More Actions\nYubiKey for Multi-Factor Authentication only available after configuring the Authenticator App\nOne of the first things I wanted to check out is the integration with YubiKeys for Multi-Factor Authentication.\nUnfortunately, this option wasn\u0026rsquo;t available until I configured the authenticator app for Two-Factor Authentication.\nThe screen below will just show you the option to configure the Authenticator App, without mentioning anything about a YubiKey. 
It would be nice if the 1Password team made this a greyed-out option stating that the Authenticator App needs to be configured first.\nNo option to configure YubiKeys\nAfter setting up the Authenticator App as a second factor, the YubiKey setup was a breeze:\nEmergency Access\nOne of the features I really like from LastPass is the emergency access feature. It feels much more comfortable knowing that my family has access to my account when they need it. Especially in this digital day and age, someone\u0026rsquo;s online footprint is huge. Being able to provide access to my online accounts when I\u0026rsquo;m in an emergency feels good and also forces me to keep my vault organized.\n1Password does have a feature to reset a family member\u0026rsquo;s password, but that still requires access to someone\u0026rsquo;s email account. It\u0026rsquo;s a nice feature to have when a team member forgets their password, but it\u0026rsquo;s not the same as emergency access. If you use this feature, make sure to select multiple family owners, as owners can only reset the passwords of members!\nConclusion on 1Password 1Password overall feels pretty solid - a bit more polished when compared to LastPass. I really like the Watchtower feature to keep my passwords and other details \u0026lsquo;healthy\u0026rsquo;. The lack of the emergency access feature is a deal breaker to me, especially with the higher price when compared to LastPass.\nDashlane Dashlane is, when compared to 1Password, rather vague about where the data is hosted. I found a blog post and Subprocessor list stating that their web services and databases are hosted on Amazon Web Services in the Dublin (Ireland) region. 
I would recommend that Dashlane mention this in a more prominent location.\nWhile I expect that Brexit won\u0026rsquo;t change anything immediately with regard to data privacy, it\u0026rsquo;s something to keep an eye out for.\nAfter registering at Dashlane, I got prompted to download the Dashlane app instead of the browser extension.\nWhile the Dashlane app was installing, I downloaded the extension for Microsoft Edge (Chromium). I quickly realized that the app for Windows has way more functions when compared to the browser extension, as you can see below.\nThe app also doesn\u0026rsquo;t seem to scale correctly on a 4K screen. I really like the idea of being able to use the browser extension, so this is a deal breaker to me:\nConclusion on Dashlane After spending quite some time finding out where Dashlane is hosted and under which regulations they fall, I started to wonder why I have to do all that while 1Password just mentions it on one of their pages. After finding out that the Windows app has way more functionality than the extension and that there is no family plan, I started to look for other options.\nOn a positive note, I really like the direction that Dashlane is heading. Especially with support for Windows Hello to unlock your vault and for the fact that they include a VPN service with the premium option. If you\u0026rsquo;re not looking for family features and don\u0026rsquo;t mind using the Windows app, Dashlane is still a solid option.\nBitwarden Use Command Line tools to access the vault A feature I was really excited about is the command line support from Bitwarden. You can use this tool on several platforms to manage your vault. On Windows, you have to work with an executable called bw.exe, which you can use to log in with bw login \u0026lt;email\u0026gt; \u0026lt;password\u0026gt;. With bw list items and bw get you can retrieve the objects available in the vault. 
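Because bw prints plain JSON, its output is easy to post-process. Below is a minimal sketch of that idea in Python; the sample data is made up, and the name/login field layout mirrors what Bitwarden item JSON roughly looks like, so treat it as an assumption rather than the authoritative schema. In practice you would feed it the captured output of bw list items:

```python
import json

# Made-up sample shaped like `bw list items` output; real data would come
# from running the Bitwarden CLI and capturing its stdout.
sample = json.loads("""
[
  {"name": "example.com", "login": {"username": "jane", "password": "s3cret"}},
  {"name": "github.com",  "login": {"username": "jane", "password": "hunter2"}}
]
""")

def find_password(items, site):
    # Return the password of the first vault item whose name matches `site`.
    for item in items:
        if item.get("name") == site:
            return item.get("login", {}).get("password")
    return None

print(find_password(sample, "github.com"))  # hunter2
```

The same loop works unchanged for notes or usernames, since every item in the list is just a dictionary.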
If you have an organization account, you can use the RESTful API to manage your vault. Awesomeness!\nUse Command Line tools to access the Bitwarden vault - Source\nPricing This was me, seeing the pricing of Bitwarden for the first time:\nWhile other password managers are charging at least 2 or 3 euros for premium, Bitwarden charges 74 euro cents per month! That\u0026rsquo;s such a big difference that I\u0026rsquo;m starting to wonder how Bitwarden can maintain the platform. But on the other hand, their Families and Teams pricing seems to be reasonable.\nThings to improve for Bitwarden Hosting Bitwarden in the Europe region\nUnfortunately, Bitwarden does not offer to host a vault in the Europe region, like 1Password and LastPass do. Of course you can decide to run the free \u0026ldquo;self-hosted\u0026rdquo; installation of Bitwarden. Personally, I don\u0026rsquo;t want to manage a critical service like a password manager myself. I think that organizations like Bitwarden, 1Password and LastPass are way more capable of doing that with dedicated operations teams.\nEmergency Access\nUnfortunately, Emergency Access isn\u0026rsquo;t available for Bitwarden.\nConclusion All four password managers I\u0026rsquo;ve looked into are solid options. The basic features are all built-in, but features like Emergency Access and hosting in Europe can make a big difference.\nI can really recommend writing down what you expect from a password manager and requesting a trial for at least 2 or 3 password managers. A password manager is a tool you use often and might use for the next couple of years.\nWhat am I going to do? Personally, I prefer what 1Password has to offer, except for the fact that they are missing an emergency access feature. 
When 1Password offers that, I\u0026rsquo;m going to make the switch to the European version of 1Password.\nThis blog post isn\u0026rsquo;t sponsored and is based on my own opinion.\n","permalink":"https://devsecninja.com/2020/01/05/finding-the-ultimate-password-manager/","summary":"\u003cp\u003eIn this blog post, I wrote down my considerations \u0026amp; requirements around finding my next password manager. I\u0026rsquo;ve been a LastPass Family user for several years, but I am going to switch to a new password manager.\u003c/p\u003e\n\u003cp\u003eIn 2015, LastPass was sold to LogMeIn.\u003c/p\u003e\n\u003cp\u003eIt didn\u0026rsquo;t feel good, but it wasn\u0026rsquo;t a deal breaker for me.\u003c/p\u003e\n\u003cp\u003eRecent security issues like \u003ca href=\"https://www.forbes.com/sites/daveywinder/2019/09/16/google-warns-lastpass-users-were-exposed-to-last-password-credential-leak/\"\u003ethis one\u003c/a\u003e or \u003ca href=\"https://www.pcworld.com/article/3184153/lastpass-fixes-serious-password-leak-vulnerabilities.html\"\u003ethis one\u003c/a\u003e made me freak out a bit.\u003c/p\u003e","title":"Finding the Ultimate Password Manager"},{"content":"My alarm clock setup is based on a Sonos One that plays music and Home Assistant in combination with DeCONZ and Node-RED to automate my bedroom lights as a wake-up light. I could also control the music on the Sonos One from Home Assistant, but I don\u0026rsquo;t want to rely on my home automation setup for me to wake up.\nCurrently, the Home Assistant integration does not get the alarm data itself, but luckily the Sonos API is easily accessible!\nIn this blog post, I\u0026rsquo;m going to explain two methods that you can use to gather the alarm clock data: PowerShell and Node-RED.\nPowerShell First of all, you can use PowerShell by running the script below. 
Make sure to change the variables on top before you run the script:\n#region Variables\n# Sonos Settings - Edit where necessary\n$sonosIP = \u0026#34;192.168.1.10\u0026#34;\n$port = 1400\n# SOAP - Do not edit\n$uri = \u0026#34;http://${sonosIP}:$port/AlarmClock/Control\u0026#34;\n$soapAction = \u0026#34;urn:schemas-upnp-org:service:AlarmClock:1#ListAlarms\u0026#34;\n$soapMessage = \u0026#39;\u0026lt;s:Envelope xmlns:s=\u0026#34;http://schemas.xmlsoap.org/soap/envelope/\u0026#34; s:encodingStyle=\u0026#34;http://schemas.xmlsoap.org/soap/encoding/\u0026#34;\u0026gt;\u0026lt;s:Body\u0026gt;\u0026lt;u:ListAlarms xmlns:u=\u0026#34;urn:schemas-upnp-org:service:AlarmClock:1\u0026#34;\u0026gt;\u0026lt;InstanceID\u0026gt;0\u0026lt;/InstanceID\u0026gt;\u0026lt;/u:ListAlarms\u0026gt;\u0026lt;/s:Body\u0026gt;\u0026lt;/s:Envelope\u0026gt;\u0026#39;\n#endregion\n#region Execution\n# Create SOAP Request\n$soapRequest = [System.Net.WebRequest]::Create($uri)\n# Set Headers\n$soapRequest.Accept = \u0026#39;gzip\u0026#39;\n$soapRequest.Method = \u0026#39;POST\u0026#39;\n$soapRequest.ContentType = \u0026#39;text/xml; charset=\u0026#34;utf-8\u0026#34;\u0026#39;\n$soapRequest.KeepAlive = $false\n$soapRequest.Headers.Add(\u0026#34;SOAPACTION\u0026#34;, $soapAction)\n# Sending SOAP Request\n$requestStream = $soapRequest.GetRequestStream()\n$soapMessage = [xml] $soapMessage\n$soapMessage.Save($requestStream)\n$requestStream.Close()\n# Sending Complete, Get Response\n$response = $soapRequest.GetResponse()\n$responseStream = $response.GetResponseStream()\n$soapReader = [System.IO.StreamReader]($responseStream)\n$returnXml = [Xml] $soapReader.ReadToEnd()\n$responseStream.Close()\n#endregion\n# Get alarms\n(([xml]$returnXml.Envelope.Body.ListAlarmsResponse.CurrentAlarmList).Alarms.Alarm)[0]\nAfter running this script, you will see that it returns the first alarm. 
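The interesting part of the SOAP response is the CurrentAlarmList value, which is itself a small XML document listing the alarms. As a rough illustration of that last parsing step in Python: the sample below is made up, and the StartTime/Enabled attribute names are assumptions based on the UPnP AlarmClock service, so check them against a real response from your own Sonos.

```python
import xml.etree.ElementTree as ET

# Made-up example of the inner XML carried in CurrentAlarmList; a real
# value comes out of the ListAlarms SOAP response on a live Sonos.
current_alarm_list = """
<Alarms>
  <Alarm ID="1" StartTime="06:45:00" Enabled="1" Recurrence="WEEKDAYS"/>
  <Alarm ID="2" StartTime="09:30:00" Enabled="0" Recurrence="WEEKENDS"/>
</Alarms>
"""

def first_alarm(xml_text):
    # Return (start_time, enabled) for the first <Alarm> element.
    alarm = ET.fromstring(xml_text).find("Alarm")
    return alarm.get("StartTime"), alarm.get("Enabled") == "1"

print(first_alarm(current_alarm_list))  # ('06:45:00', True)
```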
If you want to search for other alarms, you can play with the $returnXml variable.\nNode-RED As I\u0026rsquo;m using Node-RED extensively with Home Assistant, I can integrate a SOAP request pretty easily with Node-RED. All we need is the node-red-contrib-simple-soap package, which you can add to your Hass.io Node-RED configuration for easy installation:\n\u0026#34;npm_packages\u0026#34;: [ \u0026#34;node-red-contrib-simple-soap\u0026#34; ]\nWe apply the same logic as we used in PowerShell to Node-RED, to end up with something like this:\nFlow in Node-RED that gathers the Sonos alarm clock data\nIn this scenario, I trigger a flow every minute, which:\nSends out the SOAP request\nParses the XML that comes from the SOAP request\nGrabs the alarm time \u0026amp; state from the first alarm\nSets the time on the Sonos alarm sensor\nSets the Sonos alarm boolean to True or False\nYou can easily add the Sonos alarm inputs to your Home Assistant configuration:\n# Input booleans\ninput_boolean:\n  wakeup_sonos_alarm_enabled:\n    name: \u0026#34;Sonos Alarm Enabled\u0026#34;\n    icon: mdi:theme-light-dark\n# Input date time\ninput_datetime:\n  wakeup_sonos_alarm_time:\n    name: \u0026#34;Sonos Alarm Time\u0026#34;\n    has_time: true\n    has_date: false\nAfter you\u0026rsquo;ve restarted your Home Assistant, the new inputs should show up. 
Import the flow below into Node-RED and make sure you configure your own Home Assistant server and change the IP address of the Sonos in the SOAP request.\n\\[{\u0026#34;id\u0026#34;:\u0026#34;721f6847.34cc88\u0026#34;,\u0026#34;type\u0026#34;:\u0026#34;simple-soap\u0026#34;,\u0026#34;z\u0026#34;:\u0026#34;24b5739b.054a3c\u0026#34;,\u0026#34;host\u0026#34;:\u0026#34;http://192.168.1.10:1400\u0026#34;,\u0026#34;hostType\u0026#34;:\u0026#34;str\u0026#34;,\u0026#34;path\u0026#34;:\u0026#34;AlarmClock/Control\u0026#34;,\u0026#34;pathType\u0026#34;:\u0026#34;str\u0026#34;,\u0026#34;action\u0026#34;:\u0026#34;urn:schemas-upnp-org:service:AlarmClock:1#ListAlarms\u0026#34;,\u0026#34;actionType\u0026#34;:\u0026#34;str\u0026#34;,\u0026#34;body\u0026#34;:\u0026#34;\u0026#39;\u0026lt;s:Envelope xmlns:s=\\\\\u0026#34;http://schemas.xmlsoap.org/soap/envelope/\\\\\u0026#34; s:encodingStyle=\\\\\u0026#34;http://schemas.xmlsoap.org/soap/encoding/\\\\\u0026#34;\u0026gt;\u0026lt;s:Body\u0026gt;\u0026lt;u:ListAlarms xmlns:u=\\\\\u0026#34;urn:schemas-upnp-org:service:AlarmClock:1\\\\\u0026#34;\u0026gt;\u0026lt;InstanceID\u0026gt;0\u0026lt;/InstanceID\u0026gt;\u0026lt;/u:ListAlarms\u0026gt;\u0026lt;/s:Body\u0026gt;\u0026lt;/s:Envelope\u0026gt;\u0026#39;\u0026#34;,\u0026#34;bodyType\u0026#34;:\u0026#34;msg\u0026#34;,\u0026#34;mustache\u0026#34;:false,\u0026#34;attrkey\u0026#34;:\u0026#34;$\u0026#34;,\u0026#34;charkey\u0026#34;:\u0026#34;\\_\u0026#34;,\u0026#34;stripPrefix\u0026#34;:false,\u0026#34;simplify\u0026#34;:false,\u0026#34;normalizeTags\u0026#34;:false,\u0026#34;normalize\u0026#34;:false,\u0026#34;topic\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;name\u0026#34;:\u0026#34;SOAP 
Request\u0026#34;,\u0026#34;useAuth\u0026#34;:false,\u0026#34;x\u0026#34;:400,\u0026#34;y\u0026#34;:1940,\u0026#34;wires\u0026#34;:\\[\\[\u0026#34;90be7158.69234\u0026#34;\\]\\]},{\u0026#34;id\u0026#34;:\u0026#34;e9a4e1fc.10c98\u0026#34;,\u0026#34;type\u0026#34;:\u0026#34;inject\u0026#34;,\u0026#34;z\u0026#34;:\u0026#34;24b5739b.054a3c\u0026#34;,\u0026#34;name\u0026#34;:\u0026#34;Every minute\u0026#34;,\u0026#34;topic\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;payload\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;payloadType\u0026#34;:\u0026#34;date\u0026#34;,\u0026#34;repeat\u0026#34;:\u0026#34;60\u0026#34;,\u0026#34;crontab\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;once\u0026#34;:true,\u0026#34;onceDelay\u0026#34;:0.1,\u0026#34;x\u0026#34;:140,\u0026#34;y\u0026#34;:1940,\u0026#34;wires\u0026#34;:\\[\\[\u0026#34;721f6847.34cc88\u0026#34;\\]\\]},{\u0026#34;id\u0026#34;:\u0026#34;90be7158.69234\u0026#34;,\u0026#34;type\u0026#34;:\u0026#34;xml\u0026#34;,\u0026#34;z\u0026#34;:\u0026#34;24b5739b.054a3c\u0026#34;,\u0026#34;name\u0026#34;:\u0026#34;Parse XML\u0026#34;,\u0026#34;property\u0026#34;:\u0026#34;payload\\[\\\\\u0026#34;s:Envelope\\\\\u0026#34;\\]\\[\\\\\u0026#34;s:Body\\\\\u0026#34;\\]\\[\\\\\u0026#34;u:ListAlarmsResponse\\\\\u0026#34;\\].CurrentAlarmList\u0026#34;,\u0026#34;attr\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;chr\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;x\u0026#34;:130,\u0026#34;y\u0026#34;:2060,\u0026#34;wires\u0026#34;:\\[\\[\u0026#34;2884e5f3.9198ca\u0026#34;,\u0026#34;c8f07bd4.b12338\u0026#34;\\]\\]},{\u0026#34;id\u0026#34;:\u0026#34;2884e5f3.9198ca\u0026#34;,\u0026#34;type\u0026#34;:\u0026#34;change\u0026#34;,\u0026#34;z\u0026#34;:\u0026#34;24b5739b.054a3c\u0026#34;,\u0026#34;name\u0026#34;:\u0026#34;Set time 
payload\u0026#34;,\u0026#34;rules\u0026#34;:\\[{\u0026#34;t\u0026#34;:\u0026#34;set\u0026#34;,\u0026#34;p\u0026#34;:\u0026#34;payload\u0026#34;,\u0026#34;pt\u0026#34;:\u0026#34;msg\u0026#34;,\u0026#34;to\u0026#34;:\u0026#34;payload\\[\\\\\u0026#34;s:Envelope\\\\\u0026#34;\\]\\[\\\\\u0026#34;s:Body\\\\\u0026#34;\\]\\[\\\\\u0026#34;u:ListAlarmsResponse\\\\\u0026#34;\\].CurrentAlarmList.Alarms.Alarm\\[0\\].$.StartTime\u0026#34;,\u0026#34;tot\u0026#34;:\u0026#34;msg\u0026#34;}\\],\u0026#34;action\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;property\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;from\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;to\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;reg\u0026#34;:false,\u0026#34;x\u0026#34;:410,\u0026#34;y\u0026#34;:2040,\u0026#34;wires\u0026#34;:\\[\\[\u0026#34;c6a9b36.c1ebd5\u0026#34;\\]\\]},{\u0026#34;id\u0026#34;:\u0026#34;c6a9b36.c1ebd5\u0026#34;,\u0026#34;type\u0026#34;:\u0026#34;api-call-service\u0026#34;,\u0026#34;z\u0026#34;:\u0026#34;24b5739b.054a3c\u0026#34;,\u0026#34;name\u0026#34;:\u0026#34;Set Sonos alarm 
sensor\u0026#34;,\u0026#34;server\u0026#34;:\u0026#34;47f73e49.02b17\u0026#34;,\u0026#34;version\u0026#34;:1,\u0026#34;debugenabled\u0026#34;:false,\u0026#34;service\\_domain\u0026#34;:\u0026#34;input\\_datetime\u0026#34;,\u0026#34;service\u0026#34;:\u0026#34;set\\_datetime\u0026#34;,\u0026#34;entityId\u0026#34;:\u0026#34;input\\_datetime.wakeup\\_sonos\\_alarm\\_time\u0026#34;,\u0026#34;data\u0026#34;:\u0026#34;{\\\\\u0026#34;time\\\\\u0026#34;:\\\\\u0026#34;{{payload}}\\\\\u0026#34;}\u0026#34;,\u0026#34;dataType\u0026#34;:\u0026#34;json\u0026#34;,\u0026#34;mergecontext\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;output\\_location\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;output\\_location\\_type\u0026#34;:\u0026#34;none\u0026#34;,\u0026#34;mustacheAltTags\u0026#34;:false,\u0026#34;x\u0026#34;:710,\u0026#34;y\u0026#34;:2040,\u0026#34;wires\u0026#34;:\\[\\[\\]\\]},{\u0026#34;id\u0026#34;:\u0026#34;c8f07bd4.b12338\u0026#34;,\u0026#34;type\u0026#34;:\u0026#34;change\u0026#34;,\u0026#34;z\u0026#34;:\u0026#34;24b5739b.054a3c\u0026#34;,\u0026#34;name\u0026#34;:\u0026#34;Set enabled 
payload\u0026#34;,\u0026#34;rules\u0026#34;:\\[{\u0026#34;t\u0026#34;:\u0026#34;set\u0026#34;,\u0026#34;p\u0026#34;:\u0026#34;payload\u0026#34;,\u0026#34;pt\u0026#34;:\u0026#34;msg\u0026#34;,\u0026#34;to\u0026#34;:\u0026#34;payload\\[\\\\\u0026#34;s:Envelope\\\\\u0026#34;\\]\\[\\\\\u0026#34;s:Body\\\\\u0026#34;\\]\\[\\\\\u0026#34;u:ListAlarmsResponse\\\\\u0026#34;\\].CurrentAlarmList.Alarms.Alarm\\[0\\].$.Enabled\u0026#34;,\u0026#34;tot\u0026#34;:\u0026#34;msg\u0026#34;}\\],\u0026#34;action\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;property\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;from\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;to\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;reg\u0026#34;:false,\u0026#34;x\u0026#34;:420,\u0026#34;y\u0026#34;:2100,\u0026#34;wires\u0026#34;:\\[\\[\u0026#34;3e848197.6a535e\u0026#34;\\]\\]},{\u0026#34;id\u0026#34;:\u0026#34;235cde14.721e92\u0026#34;,\u0026#34;type\u0026#34;:\u0026#34;api-call-service\u0026#34;,\u0026#34;z\u0026#34;:\u0026#34;24b5739b.054a3c\u0026#34;,\u0026#34;name\u0026#34;:\u0026#34;Disable Sonos alarm 
boolean\u0026#34;,\u0026#34;server\u0026#34;:\u0026#34;47f73e49.02b17\u0026#34;,\u0026#34;version\u0026#34;:1,\u0026#34;debugenabled\u0026#34;:false,\u0026#34;service\\_domain\u0026#34;:\u0026#34;input\\_boolean\u0026#34;,\u0026#34;service\u0026#34;:\u0026#34;turn\\_off\u0026#34;,\u0026#34;entityId\u0026#34;:\u0026#34;input\\_boolean.wakeup\\_sonos\\_alarm\\_enabled\u0026#34;,\u0026#34;data\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;dataType\u0026#34;:\u0026#34;json\u0026#34;,\u0026#34;mergecontext\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;output\\_location\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;output\\_location\\_type\u0026#34;:\u0026#34;none\u0026#34;,\u0026#34;mustacheAltTags\u0026#34;:false,\u0026#34;x\u0026#34;:1060,\u0026#34;y\u0026#34;:2060,\u0026#34;wires\u0026#34;:\\[\\[\\]\\]},{\u0026#34;id\u0026#34;:\u0026#34;3e848197.6a535e\u0026#34;,\u0026#34;type\u0026#34;:\u0026#34;switch\u0026#34;,\u0026#34;z\u0026#34;:\u0026#34;24b5739b.054a3c\u0026#34;,\u0026#34;name\u0026#34;:\u0026#34;Enable or Disable\u0026#34;,\u0026#34;property\u0026#34;:\u0026#34;payload\u0026#34;,\u0026#34;propertyType\u0026#34;:\u0026#34;msg\u0026#34;,\u0026#34;rules\u0026#34;:\\[{\u0026#34;t\u0026#34;:\u0026#34;eq\u0026#34;,\u0026#34;v\u0026#34;:\u0026#34;0\u0026#34;,\u0026#34;vt\u0026#34;:\u0026#34;str\u0026#34;},{\u0026#34;t\u0026#34;:\u0026#34;eq\u0026#34;,\u0026#34;v\u0026#34;:\u0026#34;1\u0026#34;,\u0026#34;vt\u0026#34;:\u0026#34;str\u0026#34;}\\],\u0026#34;checkall\u0026#34;:\u0026#34;true\u0026#34;,\u0026#34;repair\u0026#34;:false,\u0026#34;outputs\u0026#34;:2,\u0026#34;x\u0026#34;:690,\u0026#34;y\u0026#34;:2100,\u0026#34;wires\u0026#34;:\\[\\[\u0026#34;235cde14.721e92\u0026#34;\\],\\[\u0026#34;483dac88.5925a4\u0026#34;\\]\\]},{\u0026#34;id\u0026#34;:\u0026#34;483dac88.5925a4\u0026#34;,\u0026#34;type\u0026#34;:\u0026#34;api-call-service\u0026#34;,\u0026#34;z\u0026#34;:\u0026#34;24b5739b.054a3c\u0026#34;,\u0026#34;name\u0026#34;:\u0026#34;Enable Sonos alarm 
boolean\u0026#34;,\u0026#34;server\u0026#34;:\u0026#34;47f73e49.02b17\u0026#34;,\u0026#34;version\u0026#34;:1,\u0026#34;debugenabled\u0026#34;:false,\u0026#34;service\\_domain\u0026#34;:\u0026#34;input\\_boolean\u0026#34;,\u0026#34;service\u0026#34;:\u0026#34;turn\\_on\u0026#34;,\u0026#34;entityId\u0026#34;:\u0026#34;input\\_boolean.wakeup\\_sonos\\_alarm\\_enabled\u0026#34;,\u0026#34;data\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;dataType\u0026#34;:\u0026#34;json\u0026#34;,\u0026#34;mergecontext\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;output\\_location\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;output\\_location\\_type\u0026#34;:\u0026#34;none\u0026#34;,\u0026#34;mustacheAltTags\u0026#34;:false,\u0026#34;x\u0026#34;:1060,\u0026#34;y\u0026#34;:2140,\u0026#34;wires\u0026#34;:\\[\\[\\]\\]},{\u0026#34;id\u0026#34;:\u0026#34;47f73e49.02b17\u0026#34;,\u0026#34;type\u0026#34;:\u0026#34;server\u0026#34;,\u0026#34;z\u0026#34;:\u0026#34;\u0026#34;,\u0026#34;name\u0026#34;:\u0026#34;Home Assistant\u0026#34;,\u0026#34;legacy\u0026#34;:false,\u0026#34;hassio\u0026#34;:true,\u0026#34;rejectUnauthorizedCerts\u0026#34;:true,\u0026#34;ha\\_boolean\u0026#34;:\u0026#34;y|yes|true|on|home|open\u0026#34;,\u0026#34;connectionDelay\u0026#34;:true}\\] Now you can use this new sensor to automate other things like enabling the bedroom lights when the Sonos alarm is set, or turn on the coffee machine after waking up!\nLet me know how you will use this new flow to automate your home!\n","permalink":"https://devsecninja.com/2019/12/15/use-the-sonos-api-to-gather-alarm-data/","summary":"\u003cp\u003e**My alarm clock setup is based on a Sonos One that plays music and \u003ca href=\"http://DevSecNinja.com/2019/08/03/bye-philips-hue-xiaomi-bridge/\"\u003eHome Assistant in combination with DeCONZ\u003c/a\u003e and Node-RED to automate my bedroom lights as a wake up light. 
I could also control the music on the Sonos One from Home Assistant, but I don\u0026rsquo;t want to rely on my home automation setup for me to wake up.\u003c/p\u003e\n\u003cp\u003eCurrently, the Home Assistant integration does not get the alarm data itself, but luckily the Sonos API is easily accessible!**\u003c/p\u003e","title":"Use the Sonos API to gather alarm data"},{"content":"**Back in 2011, when I was 17 years old, I started my studies in \u0026lsquo;senior secondary vocational education\u0026rsquo; (MBO).\nThis was the first time I fell in love with technology.\nThe more time I spent on training and certifications, the more exemptions I received from teachers.\nAfter getting my diploma, I wanted to start working with all the new technologies in enterprise environments \u0026amp; apply what I learned throughout my certification journey. I started to work at Avanade, but always with the dream to pursue my bachelor\u0026rsquo;s degree.\nDistance learning was the best option while starting a hectic career in IT consultancy.** As a 19-year-old, starting a career in IT consultancy is not the easiest job.\nEspecially if it\u0026rsquo;s your first job in the industry.\nMost students start working at a small service desk or in local IT support, but I wanted to put the theory I\u0026rsquo;d learned across my 10 certifications into practice.\nClient projects can be stressful, with very strict deadlines.\nIn such a position as a young IT consultant, I felt like I had to prove myself in order to get the best results for the client.\nDistance learning After working on projects for 6 months, I learned that I needed a flexible Bachelor\u0026rsquo;s study program to be able to align school assignments with a hectic career.\nDistance learning was the perfect option, because I could slow down the pace during stressful moments and speed it up when a project was going well.\nThe second benefit was that I didn\u0026rsquo;t have to travel to a specific location. 
I\u0026rsquo;ve had projects in cities like Amsterdam, Utrecht, Amersfoort, The Hague and Hoofddorp.\nMy commute mostly took around 1 to 4 hours per day.\nAdding another one or two hours of commuting to a university was not an option.\nIn the Netherlands, there are a couple of universities, like LOI and NCOI, that provide distance learning options for bachelor studies in information technology.\nBoth study programs have an internationally recognized accreditation from the NVAO and the Dutch Ministry of Education, Culture and Science.\nThis accreditation from the NVAO was very important to me. I applied for the 4-year \u0026ldquo;Technische Informatica\u0026rdquo; bachelor studies at the LOI.\nWhatever it takes The amount of dedication required to complete a Bachelor\u0026rsquo;s study program based on distance learning is enormous. Nobody is going to get in touch with you when you don\u0026rsquo;t make any progress. You\u0026rsquo;re on your own.\nThe only feedback you will get is from the teachers \u0026lsquo;sitting behind the portal\u0026rsquo;.\nYou will get a short chat with a coach, but that\u0026rsquo;s about it.\nThere won\u0026rsquo;t be a recurring meeting with your coach.\nIt would have been nice to be able to speak with teachers on the phone or via a FaceTime call.\nEspecially when you want some clarification on a teacher\u0026rsquo;s statement.\nSeveral people asked me what it takes to do this study program. 
I tend to ask the following question back: \u0026ldquo;what are you able to sacrifice?\u0026rdquo; If you want to complete the study program within 4 to 5 years, you have to make sacrifices.\nThis means you won\u0026rsquo;t be able to go to lots of parties or have drinks with friends that often.\nYou won\u0026rsquo;t be able to binge-watch Netflix series often or participate in several sports.\nThe resistance to studying is high when there are so many fun things to do.\nTime management If you want to both work and study, you have to carefully manage your time. If you have to drive to work instead of taking the train, the last thing you want is to lose valuable time in traffic jams and deal with increased fatigue. Use the time in the early morning to study, or go home directly after work. My regular schedule roughly looked like this:\n05:15 - Wake up 06:00 - 07:00 - Commute 07:00 - 16:00 - Work 16:00 - 17:00 - Commute 17:30 - Dinner 18:00 - Study \u0026amp; rest till bedtime 21:00 - Bedtime Consistency \u0026amp; rhythm are key, so I also woke up at 05:15 on Saturday and Sunday. Based on how I felt during the weekend, I studied a full day and rested the other day.\nCompleting the journey The study programs of the LOI are very affordable. You will see the drawbacks of this very quickly.\nThe theory is often not aligned with the exams, forcing you to take an exam several times and learn from your mistakes.\nSeveral students I know stopped the study program within the first two years.\nAlso, the customer service isn\u0026rsquo;t very responsive, to say the least.\nAs an example, it took the university 6 months (!) 
to schedule my thesis defense presentation after my thesis was finalized.\nThe only thing I could do was call customer service on a weekly or daily basis and wait in the call queue for 30 minutes, only to hear that I had to wait another week.\nDuring the thesis defense presentation, the examiner told me that only 10 percent of the students complete a study program at this university. I assume this is due to the dedication required to both survive the study program and deal with the drawbacks of a university that only offers affordable study programs.\nBachelor of Science I\u0026rsquo;m proud to say that I successfully defended my thesis about Privileged Identity \u0026amp; Access Management and therefore received my diploma recently. As I now have access to Master\u0026rsquo;s degree programs, I\u0026rsquo;m looking forward to seeing what the future holds.\n","permalink":"https://devsecninja.com/2019/08/03/what-it-takes-to-pursue-a-bachelors-degree-by-following-a-distance-learning-program/","summary":"\u003cp\u003e**Back in 2011, when I was 17 years old, I started my studies in \u0026lsquo;senior secondary vocational education\u0026rsquo; (MBO).\u003c/p\u003e\n\u003cp\u003eThis was the first time I fell in love with technology.\u003c/p\u003e\n\u003cp\u003eThe more time I spent on training and certifications, the more exemptions I received from teachers.\u003c/p\u003e\n\u003cp\u003eAfter getting my diploma, I wanted to start working with all the new technologies in enterprise environments \u0026amp; apply what I learned throughout my certification journey. I started to work at Avanade, but always with the dream to pursue my bachelor\u0026rsquo;s degree.\u003c/p\u003e","title":"What it takes to pursue a bachelor's degree by following a distance learning program"},{"content":"**My smart home journey started years ago with a simple Philips Hue configuration.\nAfter integrating Home Assistant, I wanted to have some door/window sensors and temperature sensors. 
I bought the Xiaomi Aqara bridge and implemented all the sensors into Home Assistant. I quickly realized I now have two Zigbee bridges due to the fact that both Philips and Xiaomi want a \u0026lsquo;customized\u0026rsquo; standard. ¯\\_(ツ)_/¯ I needed to do something\u0026hellip;** I was using multiple bridges because I had heard some stories about connection losses with some USB sticks.\nAs stability of the platform is crucial to me, I decided to hold back on the purchase until I saw this tweet from Franck: https://twitter.com/Frenck/status/1153634514402521088?s=20 Franck (or @Frenck) is a respected member of the Home Assistant community. I love his contributions to Hass.IO add-ons like the Spotify Connect and the Node-RED add-ons. A recommendation from him means a lot to me, so I decided to take the jump and order the Dresden Elektronik ConBee II stick.\nThis stick is able to integrate with products from Philips Hue, IKEA Trådfri, OSRAM SYLVANIA, Samsung SmartThings, Xiaomi Aqara and many more.\nConfigure the deCONZ Add-on The weekend after the arrival of the gateway, I started with the configuration in Home Assistant \u0026amp; Hass.IO. I installed the deCONZ Hass.IO plugin, which was easy-peasy. The deCONZ add-on asks you to configure a device in the config. This is usually something like:\n/dev/ttyAMA0 /dev/ttyUSB0 /dev/ttyACM0 You can get this hardware ID if you go to Hass.IO =\u0026gt; System =\u0026gt; Hardware. In my case, only the following hardware IDs showed up:\n/dev/ttyACM0 /dev/serial/by-id/usb-dresden_elektronik_ingenieurtechnik_GmbH_ConBee_II_DE\u0026lt;*\u0026gt; The first hardware ID didn\u0026rsquo;t work in my case. As this hardware ID is mentioned in the documentation of deCONZ, I thought there must be something wrong. But the second hardware ID worked fine for me. This ID doesn\u0026rsquo;t seem to work with ZHA, so I stick with deCONZ.\nDiscover your devices Now you have configured the add-on. 
You can click on the Open Web UI button on the deCONZ Hass.IO add-on page.\nThis will show you the Web UI of deCONZ so you can start with the configuration of your devices. I was surprised to see that all my lights, switches and sensors are supported with deCONZ: [gallery ids=\u0026ldquo;2578,2579,2580\u0026rdquo; type=\u0026ldquo;rectangular\u0026rdquo;] So far this setup has been pretty reliable for me.\nIt works stably, and the range of the Zigbee Gateway is pretty good.\nSomehow the range is even better than that of the big Xiaomi Gateway.\nTroubleshooting I did run into an issue while doing a factory reset of my Philips Hue Light bulbs. You can find step-by-step instructions online about how you can reset your bulbs by using a Hue Dimmer switch, but it didn\u0026rsquo;t work for me as most of the sites are missing one crucial step.\nYou need to follow the steps below very carefully: _First, turn off the light bulb by disconnecting the bulb from power by using a physical light switch on the wall (very important!).\nOtherwise the factory reset might not work, which was driving me nuts as most online instructions don\u0026rsquo;t provide this info._ _Second, turn on the bulb and ensure that all the other bulbs are turned off.\nHold the Hue Dimmer switch close to the bulb.\nPress and hold the 1st (On / I in some countries) and the 4th (Off / O in some countries) buttons simultaneously for about 10 seconds, until the bulb starts blinking on and off.\nIf the bulb stopped blinking (it will normally turn fully on again after reset) and the green LED on the remote (top left of the remote) blinked green - your bulb is reset.\nYou can now discover the bulb with deCONZ._\nGood bye So long, Philips Hue and Xiaomi Aqara Bridge! 
","permalink":"https://devsecninja.com/2019/08/03/bye-philips-hue-xiaomi-bridge/","summary":"\u003cp\u003e**My smart home journey started years ago with a simple Philips Hue configuration.\u003c/p\u003e\n\u003cp\u003eAfter integrating Home Assistant, I wanted to have some door/window sensors and temperature sensors. I bought the Xiaomi Aqara bridge and implemented all the sensors into Home Assistant. I quickly realized I now have two Zigbee bridges due to the fact that both Philips and Xiaomi want a \u0026lsquo;customized\u0026rsquo; standard. ¯\\_(ツ)_/¯ I needed to do something\u0026hellip;** I was using multiple bridges because I had heard some stories about connection losses with some USB sticks.\u003c/p\u003e","title":"Bye Philips Hue \u0026 Xiaomi Bridge!"},{"content":"**This week Microsoft quietly released some information about a new Azure solution called Azure Bastion.\nAzure Bastion acts as a gateway between a Virtual Machine in Azure and your session in the Azure Portal.\nThis means that without assigning a Public IP address, you are able to connect to your Azure Virtual Machine through the Azure Portal.\nNo Remote Desktop environment or jumpbox needed.\nAzure Bastion is currently in Private Preview.** Microsoft quietly released two YouTube videos.\nThe first one highlights the Azure Bastion solution in combination with Windows (Server) Virtual Machines over RDP.\nIn this video, you see someone connecting to a Virtual Machine in Azure.\nThe Virtual Machine does not have a Public IP address assigned.\nAfter the presenter clicks on the Connect button, the credentials for the Azure Bastion service must be provided.\nThe Azure Bastion service will now launch a Remote Desktop connection in the browser from the Azure Portal.\nLooking for Azure Bastion in combination with SSH?\nMicrosoft also released a new video on their website that shows Azure Bastion in action with SSH.\nWhy you should use Azure Bastion when it\u0026rsquo;s available In a 
\u0026rsquo;traditional\u0026rsquo; cloud scenario, you don\u0026rsquo;t want to manage the VM over its Public IP address.\nFrom a security perspective, it is recommended to use a machine in front of it.\nThis can be a Remote Desktop Services environment or one or more management VMs.\nThis is what we call a jumpbox or bastion environment.\nAs you can imagine, a jumpbox introduces extra costs such as Azure VM costs, but also indirect costs such as server management.\nEspecially if you are leveraging multiple Azure regions and want to incorporate high availability of the jumpboxes.\nAdditionally, you need to maintain a separate access list that defines who is allowed to log in to the jumpbox.\nAs Azure Bastion is a cloud solution, you don\u0026rsquo;t have to pay for the underlying IaaS resources and Microsoft manages the solution for you.\nThe feature is currently in private preview, so no pricing information is available yet.\nBut I assume Microsoft will follow the pay-per-use model, which is usually significantly cheaper.\nAdditionally, I assume that Microsoft will allow this solution to run independently from an Azure region, so you don\u0026rsquo;t have to worry about High Availability and failover.\nAzure Bastion should also simplify your access management, as all cloud solutions incorporate Azure RBAC, which again simplifies the way you can assign access to cloud solutions.\nHow Azure Bastion changes how you connect to servers Microsoft is pushing passwordless scenarios as \u0026ldquo;76 % of breaches start with compromised passwords\u0026rdquo; according to Microsoft\u0026rsquo;s Ann Johnson. 
(SiliconRepublic.com) Microsoft is internally on a passwordless mission and \u0026ldquo;have moved 80 % of internal users away from passwords and we are aggressively moving the other 20 %\u0026rdquo; according to Johnson.\nAzure Bastion changes my vision on an optimal Privileged Identity Management strategy.\nUser account: uses Windows Hello biometric authentication to log in without a password.\nServer admin: uses Azure Bastion to log in to Azure VMs by using the local administrator password. This password can be managed by the Local Administrator Password Solution (LAPS) of Microsoft and will be changed every couple of hours. As Microsoft keeps track of everything in the Azure Portal, Audit Logs should provide clarification on who logged in to the machine.\nDomain admin: uses a Privileged Access Workstation/Virtual Machine to work with Domain Controllers.\nGlobal admin: uses Azure AD Privileged Identity Management to request permissions \u0026ldquo;Just-in-Time\u0026rdquo;.\nIt\u0026rsquo;s (seems) fast! The videos released by Microsoft reveal that it works fast. Of course we have to wait for the public preview to find out in which situations it works fast and when it starts to lag, but it\u0026rsquo;s promising.\nConclusion Azure Bastion seems to be a great addition to the Azure features currently available and will change how organizations provide access to Virtual Machines in Azure.\nAzure Bastion can be used for both Windows and Linux workloads, as RDP and SSH sessions are supported.\nAs Azure Bastion is currently in Private Preview, I wasn\u0026rsquo;t able to find any documentation about this new solution.\nAs soon as documentation is available from Microsoft, I will update this post. 
I\u0026rsquo;m really looking forward to the Public Preview of this feature.\nHow do you think that Azure Bastion will change the way you access servers on Azure?\nLet me know in the comments section.\n","permalink":"https://devsecninja.com/2019/06/16/azure-bastion-changes-everything/","summary":"\u003cp\u003e**This week Microsoft quietly released some information about a new Azure solution called Azure Bastion.\u003c/p\u003e\n\u003cp\u003eAzure Bastion acts as a gateway between a Virtual Machine in Azure and your session in the Azure Portal.\u003c/p\u003e\n\u003cp\u003eThis means that without assigning a Public IP address, you are able to connect to your Azure Virtual Machine through the Azure Portal.\u003c/p\u003e\n\u003cp\u003eNo Remote Desktop environment or jumpbox needed.\u003c/p\u003e\n\u003cp\u003eAzure Bastion is currently in Private Preview.** Microsoft quietly released two YouTube videos.\u003c/p\u003e","title":"Azure Bastion changes everything!"},{"content":"**In my last blog post, I introduced you to the solution I\u0026rsquo;m using on Office 365 to create personal e-mail addresses.\nThis solution is hosted on Azure DevOps and is automatically released to the PowerShell Gallery.\nIn this blog post, I will explain briefly how this works.** The Azure DevOps project is hosted in a public repository.\nThis was introduced recently and enables other users of the module to contribute to the project. I started with an empty repository and created a new branch to upload the PowerShell module \u0026amp; manifest file to the repository.\nBuild Who doesn\u0026rsquo;t like a neat PowerShell script? I do! So I always want to make sure that my scripts comply with the rules defined in the PSScriptAnalyzer code checker.\nThe module must pass these rules before it gets uploaded to the PowerShell Gallery. I was able to do this by using an Azure DevOps Build Pipeline.\nThe link redirects you to a successful build that shows you how this works. 
(Also notice how it grabs the Build ID and uses it as a release version of the module in the PowerShell Gallery!)\nRelease After a successful build, it\u0026rsquo;s time for the release to production! (OK, I might soon add a step or two before releasing to production. 😂) The release grabs the build artifact which contains the module and the manifest, and uploads it to the PowerShell Gallery.\nThe result A fully operational CI/CD pipeline that automatically builds, checks the code and releases it to the PowerShell Gallery! Great stuff! ✌\n","permalink":"https://devsecninja.com/2019/05/20/azure-devops-ci/cd-pipeline-powershell-psscriptanalyzer/","summary":"\u003cp\u003e**In my \u003ca href=\"http://DevSecNinja.com/2019/05/19/my-office-365-based-personal-secure-unique-email-system/\"\u003elast blog post\u003c/a\u003e, I introduced you to the solution I\u0026rsquo;m using on Office 365 to create personal e-mail addresses.\u003c/p\u003e\n\u003cp\u003eThis solution is hosted on Azure DevOps and is automatically released to the PowerShell Gallery.\u003c/p\u003e\n\u003cp\u003eIn this blog post, I will explain briefly how this works.** The \u003ca href=\"https://dev.azure.com/DevSecNinja/Office%20365%20Alias%20Module/\"\u003eAzure DevOps project is hosted in a public repository\u003c/a\u003e.\u003c/p\u003e\n\u003cp\u003eThis was introduced recently and enables other users of the module to contribute to the project. I started with an empty repository and created a new branch to upload the PowerShell module \u0026amp; manifest file to the repository.\u003c/p\u003e","title":"Azure DevOps CI/CD Pipeline, PowerShell \u0026 PSScriptAnalyzer"},{"content":"A secure email system is important to protect yourself from the increasing phishing and spam attacks, where hackers try to steal money on a large scale. More and more organizations hosting personal data are getting hacked, where hackers misuse this personal data to launch attacks. 
In this blog post, I will introduce you to a solution I\u0026rsquo;ve created based on Office 365 and PowerShell to prevent my email address from being spread more widely on the internet.\nRequirements As I always like to start with requirements before building a solution\u0026hellip;\nE-mail addresses/aliases must be revoked when spam or phishing is delivered to the alias. I also want to prevent using aliases like amazon2@domain.com after revoking an alias. Email addresses/aliases must have a unique identifier to make sure that hackers cannot guess the email address for another service. For example, if you use amazon@domain.com, a hacker could guess bestbuy@domain.com as well. Also, wildcards like John+amazon@domain.com are easy to manipulate with a regex. All email delivered to the email addresses/aliases should be forwarded to one address. I prefer to be able to deliver emails to other addresses as well. For example, if you have an address for your Amazon account, you might want to send emails about delivery to the rest of the family as well. An e-mail address or alias should be active within seconds. The solution must be able to track how many aliases and which aliases are already created. The solution must be easy to use and straightforward. (e.g. prevent the use of a database to keep track of aliases) The e-mail solution must be hosted on a secure email system like Office 365 (preferred) with powerful spam filtering capabilities. The email solution must be fully managed. I don\u0026rsquo;t want to spend time on keeping the system working. These requirements forced me to look into a solution similar to how an API key works. If you want to have access to an API, you can request an API key which is unique to you. The organization that provides the API key stays in full control of the key. When the API is misused by the user, the organization can revoke the API key, preventing further access to the API. 
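The requirement above about wildcard aliases can be demonstrated with a one-line regex: a plus-tagged address like John+amazon@domain.com trivially reveals the real mailbox. This Python sketch (example addresses are illustrative) shows why plus-addressing offers no protection:

```python
import re

def strip_plus_tag(address):
    """Remove a +tag from the local part, recovering the real mailbox."""
    # Match a '+' and everything up to (but not including) the '@'.
    return re.sub(r"\+[^@]*(?=@)", "", address)

print(strip_plus_tag("John+amazon@domain.com"))   # John@domain.com
print(strip_plus_tag("John+bestbuy@domain.com"))  # John@domain.com
```

Any spammer who scrapes a plus-tagged address can strip the tag the same way, which is why the solution below uses random, unrelated aliases instead.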
But how can we achieve this with a secure e-mail provider that limits you to fewer than 150 aliases?\nOptions that crossed my mind My initial thought was to create aliases for my Office 365 account, but after doing some testing I found two major issues:\nIt takes at least 15-20 minutes for an alias to get active and you don\u0026rsquo;t know when it\u0026rsquo;s active. This is really frustrating if you want to register a new e-mail address for a new company or website. Office 365 limits the number of aliases to \u0026lt;400. Depending on which documentation you read, users report a limit of 150 or 200 proxy addresses. Other large email providers do this as well. Solution The solution is centered around Distribution Groups in Office 365. To support the solution, I\u0026rsquo;ve developed a PowerShell module and made it available on the PowerShell Gallery.\nThis resulted in the following scenario I\u0026rsquo;m now actively using.\nThe solution creates email addresses in the following format: \u0026lt;prefix\u0026gt;\u0026lt;random number\u0026gt;@\u0026lt;subdomain\u0026gt;.\u0026lt;domain\u0026gt;. The prefix is used to easily identify email addresses created with this solution.\nThe random number makes it hard for attackers to identify other email addresses in the domain.\nThe subdomain is useful for other Doe family members, so they can also create these aliases if required.\nAn e-mail address for John Doe for Amazon.com looks like: JD30485@j.doe.com\nPrerequisite As a prerequisite I\u0026rsquo;ve created 100 \u0026lsquo;claimable\u0026rsquo; distribution groups which I rename before use. I also installed my PowerShell module on Azure Cloud Shell to make sure I can easily create new email addresses and aliases from any browser or the Azure app. See the last chapter for the module download.\nWhen I create a new account at a website like Amazon.com Let\u0026rsquo;s say I want to register an account with Amazon.com.\nI open the Azure Cloud Shell by browsing to shell.azure.com/powershell. 
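The alias format described above (prefix, random number, subdomain) can be sketched in a few lines of Python. This is purely an illustration of the naming scheme: the function name make_alias and all values are my own hypothetical examples, not part of the actual Office365MailAliases module, which works with Office 365 distribution groups instead.

```python
import random

def make_alias(prefix: str, subdomain: str, domain: str) -> str:
    """Build an alias like JD30485@j.doe.com: a short prefix plus a
    random number that makes other aliases hard to guess."""
    number = random.randint(10000, 99999)  # five-digit random identifier
    return f"{prefix}{number}@{subdomain}.{domain}"

print(make_alias("JD", "j", "doe.com"))
```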
As I already provisioned claimable distribution groups, I can just run the following command: Select-MailAlias -Verbose -DomainName amazon.com This command will change the display name of a claimable distribution group to match the domain name. I provide the personalized email address to Amazon. All emails will be forwarded to my primary mail account. Optionally, I can also include other family members in the distribution group to make sure they will also receive emails from Amazon. Download the module The module is available on the PowerShell Gallery, which can be installed by running the following PowerShell command:\nInstall-Module -Name Office365MailAliases\nThe module contains various cmdlets you can use. Have a look! Want to contribute?\nFeel free to open up a Pull Request!\nThe project is hosted on a public GitHub repository.\nThis project is integrated with a complete CI/CD pipeline that automatically tests, builds and releases to the PowerShell Gallery!\nMore on that in a later blog post.\nLet me know if this is something you will use too! I really like it for my personal email as I already have 20+ e-mail addresses registered.\n","permalink":"https://devsecninja.com/2019/05/19/my-office-365-based-personal-secure-unique-email-system/","summary":"\u003cp\u003e\u003cstrong\u003eA secure email system is important to protect yourself from the increasing phishing and spam attacks, where hackers try to steal money on a large scale. More and more organizations hosting personal data are getting hacked, where hackers misuse this personal data to launch attacks. 
In this blog post I will introduce you to a solution I\u0026rsquo;ve created based on Office 365 and PowerShell to prevent my email address from being spread more widely on the internet.\u003c/strong\u003e\u003c/p\u003e","title":"My Office 365 based personal, secure \u0026 unique email system"},{"content":"**Microsoft recently announced the Public Preview of the ability to run PowerShell code in an Azure Function.\nThis means that the PowerShell code will run in a Platform-as-a-Service solution, completely serverless!\nYou pay only for the time that you use the solution and you don\u0026rsquo;t have to manage the underlying infrastructure!\nIn this blog post, I will show a practical example of how to use an Azure Function in combination with an Azure Logic App.** After publishing a blog post, I always want to share the post as quickly as possible on Twitter \u0026amp; LinkedIn.\nThis happens automatically by using Twitter on WordPress, and after a day it gets tweeted again by my Azure Logic App solution I created earlier.\nBut this means a blog post is only mentioned twice on Twitter.\nObjective To make sure that my blog posts are getting some more attention, I wanted to share a random older blog post once a week.\nAzure Logic App As I like to keep things simple and as I do most of my automation in Azure Logic Apps, I started off with a blank Logic App to gather the RSS feed.\nUnfortunately, I quickly found out that it\u0026rsquo;s not that easy to get a random item from an array. So why not combine the power of Azure Logic Apps with the new, in-preview version of PowerShell in Azure Functions?\nPowerShell in Azure Functions After creating the Azure Function and selecting PowerShell, you start off by creating an HTTP Trigger function:\nThis allows you to trigger the PowerShell Function by using an HTTP request. 
Opening the trigger shows you the function menu on the left, a PowerShell editor in the middle, a log/console window on the bottom and a test view on the right.\nI was surprised to see how fast I was able to run a script in this new environment.\nIt works quickly: your script will run in about 5 seconds.\nThat is amazingly fast when compared to Azure Automation runbooks.\nIt feels like you have a shell open on your local machine with a little bit of delay. I also like both the console and the output field to be able to see what the output will be if you call the HTTP trigger. I\u0026rsquo;m using the following PowerShell script in the Azure Function trigger:\nusing namespace System.Net\n\n# Input bindings are passed in via param block.\nparam($Request, $TriggerMetadata)\n\n# Write to the Azure Functions log stream.\nWrite-Host \u0026#34;PowerShell HTTP trigger function processed a request.\u0026#34;\n\n# Get the feed\n[xml]$Content = Invoke-WebRequest -Uri \u0026#39;https://cloudenius.com/feed/\u0026#39;\n$Object = ForEach ($msg in ($Content.rss.channel.Item | Get-Random)){\n    [PSCustomObject]@{\n        \u0026#39;Title\u0026#39; = $msg.title\n        \u0026#39;pubDate\u0026#39; = [datetime]$msg.pubdate\n        \u0026#39;Link\u0026#39; = $msg.link\n    }\n}\n\n# Convert the object to JSON\n$Body = $Object | ConvertTo-Json\n\n# Associate values to output bindings by calling \u0026#39;Push-OutputBinding\u0026#39;.\nPush-OutputBinding -Name Response -Value ([HttpResponseContext]@{\n    StatusCode = [HttpStatusCode]::OK\n    Body = $Body\n})\nUsing the data from the Azure Function in an Azure Logic App Now that we have the data of the random blog post in the Azure Function, we can use it in the Azure Logic App to post the content on Twitter: You can use the following JSON schema to validate the content that comes from the Azure Function:\n{ \u0026#34;properties\u0026#34;: { \u0026#34;Link\u0026#34;: { \u0026#34;type\u0026#34;: \u0026#34;string\u0026#34; }, \u0026#34;Title\u0026#34;: { \u0026#34;type\u0026#34;: \u0026#34;string\u0026#34; }, \u0026#34;pubDate\u0026#34;: { \u0026#34;type\u0026#34;: \u0026#34;string\u0026#34; } }, \u0026#34;type\u0026#34;: \u0026#34;object\u0026#34; } End result Last weekend I worked on this solution, and on Monday I was happy to see that it worked, as you can see below: https://twitter.com/DevSecNinja/status/1127836495581200384?s=20 Cheers!\n","permalink":"https://devsecninja.com/2019/05/14/use-powershell-in-azure-functions-preview/","summary":"\u003cp\u003e**Microsoft \u003ca href=\"https://devblogs.microsoft.com/powershell/public-preview-of-powershell-in-azure-functions-2-x/\"\u003erecently announced\u003c/a\u003e the Public Preview of the ability to run PowerShell code in an Azure Function.\u003c/p\u003e\n\u003cp\u003eThis means that the PowerShell code will run in a Platform-as-a-Service solution, completely serverless!\u003c/p\u003e\n\u003cp\u003eYou pay only for the time that you use the solution and you don\u0026rsquo;t have to manage the underlying infrastructure!\u003c/p\u003e\n\u003cp\u003eIn this blog post, I will show a practical example of how to use an Azure Function in combination with an Azure Logic App.** After publishing a blog post, I always want to share the post as quickly as possible on Twitter \u0026amp; LinkedIn.\u003c/p\u003e","title":"Use PowerShell in Azure Functions [Preview]"},{"content":"**After passing the
AZ-300 exam and being not too happy about the new exam experience, I liked this exam much better.\nLet me explain why.** In my blog post about the AZ-300, I talked about the fact that you can run out of time quickly and that the exam was lacking transparency about the number and type of questions you still have to do when compared to the time you have left.\nThe AZ-301 was way better. I only had to do 3 cases, 2 at the start of the exam and one at the end.\nThis allowed me to understand how much time I could spend on a case and on the multiple choice questions. I also managed to do this exam in under 120 minutes, which means I had at least 30 minutes on the clock.\nRelevant cases The exam focuses on creating solutions around requirements and considerations. To be able to pass the exam, you need to understand what kind of solutions fit the requirements of the customer, especially around the Platform-as-a-Service (PaaS) solutions in Azure.\nIn most Microsoft exams, you look at the requirements for the new environment and answer the questions based on that.\nThis time I had to look for the specifications of the current environment as well, to find some pitfalls that affect the solution. (Be aware of this!)\nExam score communication This time I received the \u0026lsquo;Pass\u0026rsquo; score right after the exam while still in the Pearson Vue application, so that is good. In contrast to the AZ-300 exam, I also received the \u0026ldquo;Congratulations on your Microsoft certification!\u0026rdquo; email this time.\nTwo more things\u0026hellip; Pearson Vue ditched the PVProctor application, which you need to use if you sit for the exam at home.\nThat\u0026rsquo;s a great move, because it used to be an installer based on Adobe Air. 
(*Yuck*) Pearson Vue now uses a new portable application called OnVUE, which I talked about in my previous blog post as well.\nWhile I absolutely like this new portable application, I\u0026rsquo;ve seen two issues with it.\nThe first one is that the Pearson Vue system check, which analyzes whether your system is ready for the exam, is still based on the old PVProctor application.\nThe second issue is that the OnVUE application does not remove the old PVProctor application or notify you to remove it.\nThis means that everyone who used the PVProctor application before needs to remove the PVProctor application and Adobe Air manually.\nOf course not everyone will frequently check which applications are installed on their machine, so we will see a lot of legacy PVProctor applications in the wild with old and possibly vulnerable Adobe Air instances. (*Sigh*) Hope this gives you some insights into this exam and the exam experience.\nGood luck!\n","permalink":"https://devsecninja.com/2019/05/11/passed-az-301-microsoft-azure-architect-design/","summary":"\u003cp\u003e**After passing the AZ-300 exam \u003ca href=\"http://DevSecNinja.com/2019/05/04/passed-az-300-my-take-on-the-new-microsoft-exams/\"\u003eand being not too happy about the new exam experience\u003c/a\u003e, I liked this exam much better.\u003c/p\u003e\n\u003cp\u003eLet me explain why.** In my \u003ca href=\"http://DevSecNinja.com/2019/05/04/passed-az-300-my-take-on-the-new-microsoft-exams/\"\u003eblog post about the AZ-300\u003c/a\u003e, I talked about the fact that you can run out of time quickly and that the exam was lacking transparency about the number and type of questions you still have to do when compared to the time you have left.\u003c/p\u003e","title":"Passed AZ-301 - Microsoft Azure Architect Design"},{"content":"This morning I passed the AZ-300 exam. To be honest, I was confident that I had failed the exam, especially because I ran out of time with only 80 - 90 % of the questions answered. 
In this blog post, I will explain the good and the bad of this exam and the exam experience.\nThe Good Labs are a great way to test knowledge The labs that I had to take in the exam were stable and solid. You get access to a Windows VM with a browser on it which automatically opens the Azure Portal.\nOther websites are not accessible of course, but you can use the calculator or other Windows components if you want. :) The labs are based on real-world scenarios, combining cases you need to solve through the Azure Portal as well as the command line. I had no issues with the performance of the labs, but I would\u0026rsquo;ve appreciated it if Microsoft had provided more clarification throughout the labs.\nJust to give you an example here.\nThink about a sample question like \u0026ldquo;Create a Windows Server VM with X amount of storage\u0026rdquo;.\nAll kinds of questions are popping up in my head like \u0026ldquo;Does the system know if I place it in a different Resource Group than what\u0026rsquo;s in the subscription?\u0026rdquo; and \u0026ldquo;Can I just create a new VNET, or do I need to use the existing VNET in the subscription?\u0026rdquo;. I recommend that Microsoft provide this information before starting the lab. A sentence like \u0026ldquo;If no information is given about the name of the Resource Group and VNET, you should create your own\u0026rdquo; prevents stress there.\nRelevant questions The exam contains relevant questions, but the content is close to the content you need to know for the Certified Azure Administrator exams. I was expecting some more in-depth questions about requirements gathering and the Azure Solutions Architectures. 
I hope this will be addressed in the AZ-301 exam, which I will sit shortly.\nNew PVProctor application with some great enhancements I always plan to do my exam at home so I can wear comfortable clothes, in my own environment and with my own laptop.\nPearson Vue requires you to install an application called PVProctor on your machine, which is able to launch a secure browser and contains the chat functionality to get in touch with the proctor.\nIt used to run on Adobe Air. (*Yuck*) This application depended heavily on the built-in webcam of my laptop.\nFor example, I had to take pictures of my ID card and show a 360-degree overview of my room with this webcam.\nThis was my first exam with the new PVProctor application by Pearson Vue, which relies much more on a smartphone.\nBefore taking the exam, the application asks you to browse to a website with your smartphone.\nOn this website, you need to follow a couple of steps like uploading some pictures of your workplace and of your ID card.\nThis method saves you from taking pictures over and over again because of the bad-quality notebook webcams.\nDo note that you will still be proctored by the webcam of your notebook during the exam.\nThe biggest benefit of this new version is that you have access to a virtual whiteboard, which you can use to write down some text.\nThis text will stay there throughout the exam!\nThe Bad Way too much to do in a short period of time Be prepared to use every minute out of the 150 minutes you\u0026rsquo;ve got. I never ran out of time during an exam because I often flag questions that I need to revisit at the end of the exam. This time I did run out of time, with 80 - 90 % of the questions answered.\nLack of transparency to understand how many questions/labs/cases you still need to do The exam can be split into multiple labs, cases and questions. 
Imagine you have to answer 5 questions in a case, answer 9 multiple choice questions, and work through 2 labs before answering some more multiple choice questions again. It\u0026rsquo;s very hard to understand how much time you have left for a lab or for a case. I recommend that Microsoft let the user choose when to do the labs, cases or multiple choice questions, or provide insight into what\u0026rsquo;s coming up during the exam.\nLabs work fine on a 4K 15\u0026quot; laptop screen But I can imagine that it\u0026rsquo;s a pain with everything below 1920x1080. Especially if you take this exam at an exam center with small monitors.\nNo communication after ending the exam and still haven\u0026rsquo;t received an email This is what still frustrates me. I had to visit the MCP Portal to find out that I passed.\nAfter answering the post-exam questions from Microsoft, the exam stopped without any information and the PVProctor application closed.\nPreviously I would\u0026rsquo;ve seen my score at the end of the exam or at least have received an e-mail with my score.\nIt\u0026rsquo;s 12 hours after the start of the exam and I still haven\u0026rsquo;t received any formal communications from Microsoft.\nHope this gives you some insights into this exam.\nGood luck!\n","permalink":"https://devsecninja.com/2019/05/04/passed-az-300-my-take-on-the-new-microsoft-exams/","summary":"\u003cp\u003e\u003cstrong\u003eThis morning I passed the AZ-300 exam. To be honest, I was confident that I had failed the exam, especially because I ran out of time with only 80 - 90 %. In this blog post, I will explain the good and the bad of this exam and the exam experience.\u003c/strong\u003e\u003c/p\u003e\n\u003ch2 id=\"the-good\"\u003eThe Good\u003c/h2\u003e\n\u003ch3 id=\"labs-are-a-great-way-to-test-knowledge\"\u003eLabs are a great way to test knowledge\u003c/h3\u003e\n\u003cp\u003eThe labs that I had to take in the exam were stable and solid. 
You get access to a Windows VM with a browser on it which automatically opens the Azure Portal.\u003c/p\u003e","title":"Passed AZ-300 - My take on the new Microsoft exams"},{"content":"**People who follow me on Twitter might have noticed that I\u0026rsquo;m working more and more with Microsoft Flow.\nMicrosoft Flow allows me to create simple automations (like IFTTT) and to create a bridge between services like Office 365 and my home automation with Home Assistant.\nRecent changes to the pricing model made me decide to move away from Microsoft Flow, back to Azure Logic Apps.\nIn this blog post, I\u0026rsquo;ll explain how easy it is to move your flows to Azure Logic Apps.** In my flows I\u0026rsquo;ve been a heavy user of the HTTP Trigger and HTTP Request actions, because I\u0026rsquo;m relying mostly on REST API calls to perform the automation tasks.\nMicrosoft has suddenly decided that a premium account is needed to use these actions.\nWhile I understand that Microsoft Flow uses compute power in the background and costs money to run, I think Microsoft could have done better by:\nCommunicating that these triggers \u0026amp; requests will require a premium account in X amount of months. Microsoft suddenly changed this in Microsoft Flow and forced me to migrate my Flows to Azure Logic Apps as quickly as possible to be able to create new flows or edit current flows. I do appreciate that Microsoft kept the already created flows running. Allowing a Microsoft account (MSA) to be registered as a premium account by somehow linking it to an Office 365 account. As I was using a Microsoft account and therefore not able to upgrade to a Microsoft Flow premium account, I was basically stuck. https://twitter.com/DevSecNinja/status/1113068148171964416?s=20\nMigrate your Microsoft Flows to Azure Logic Apps As Microsoft Flow is using the same back end as Azure Logic Apps, they made it very easy to migrate to Azure Logic Apps. 
Before you follow this guide, make sure you have an active Azure subscription that you can use for the Azure Logic Apps.\nBrowse to the Microsoft Flow Portal. Log in with your Microsoft Flow account. On the left-hand side, click on My Flows. On a flow that you want to migrate to Azure Logic Apps, click the More Commands button, select Export and click Logic Apps Template (.json). This will create a backup of the Microsoft Flow in JSON. Feel free to open up the file with Visual Studio Code or Notepad to see how the flows are structured. If you are in Visual Studio Code, you can make the JSON more human-readable by formatting the document. You can do this by using the SHIFT + ALT + F shortcut. You can also open the Command Palette (CTRL + SHIFT + P) and type Format Document. Browse to the Azure Portal and search for Deploy a custom template in the search bar. This will open up the Custom deployment blade. As we already have a JSON template, click on Build your own template in the editor. On the right-hand side of the screen, paste the contents of the JSON file you\u0026rsquo;ve downloaded earlier. This can be the original export or the formatted version. Click on the Save button to load the template. The next screen allows you to: Select the subscription where the Azure Logic App will be created. Select the resource group of the Azure Logic App. Select the Azure region of the Azure Logic App. Provide the name \u0026amp; location of the Azure Logic App. (You can keep the location as [resourceGroup().location] to add the Azure Logic App to the same Resource Group as the deployment.) Provide values for additional parameters, like the name of the connectors if your export contains additional connectors like the Outlook connector. Clicking on the Purchase button after you have agreed to the terms will start the deployment. In a couple of minutes, your Microsoft Flow flow is now running as an Azure Logic App! 
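As an alternative to formatting the export in Visual Studio Code, the same human-readable result can be produced with Python's json module. A small sketch, where the filename flow.json and its contents are placeholders rather than an actual export:

```python
import json
import pathlib

# Stand-in for the exported Logic Apps template; in practice this is the
# .json file downloaded from the Microsoft Flow Portal.
pathlib.Path("flow.json").write_text('{"contentVersion":"1.0.0.0","parameters":{}}')

# Load the compact export and rewrite it with indentation so the
# ARM template structure is easy to read.
template = json.loads(pathlib.Path("flow.json").read_text())
pathlib.Path("flow-formatted.json").write_text(json.dumps(template, indent=4))
```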
Make sure to test the Azure Logic App by browsing to the Resource Group you\u0026rsquo;ve just created (CloudeniusApp in the example) and triggering the Logic App. If your Logic App is using connectors, you might need to add them to the Azure Logic App. After successfully testing the Logic App, you can remove the flow from Microsoft Flow. That\u0026rsquo;s it! You are now using Azure Logic Apps instead of Microsoft Flow!\nWant to learn more about Azure Logic Apps? Check out the following blog post: Use Azure Logic Apps and RSS to Create a Simple Post Reminder on Social Media.\n","permalink":"https://devsecninja.com/2019/04/06/migrate-to-azure-logic-apps-from-microsoft-flow/","summary":"\u003cp\u003e**People who follow me on Twitter might have noticed that I\u0026rsquo;m working more and more with Microsoft Flow.\u003c/p\u003e\n\u003cp\u003eMicrosoft Flow allows me to create simple automations (like IFTTT) and to create a bridge between services like Office 365 and my home automation with Home Assistant.\u003c/p\u003e\n\u003cp\u003eRecent changes to the pricing model made me decide to move away from Microsoft Flow, back to Azure Logic Apps.\u003c/p\u003e\n\u003cp\u003eIn this blog post, I\u0026rsquo;ll explain how easy it is to move your flows to Azure Logic Apps.** In my flows I\u0026rsquo;ve been a heavy user of the HTTP Trigger and HTTP Request actions, because I\u0026rsquo;m relying mostly on REST API calls to perform the automation tasks.\u003c/p\u003e","title":"Migrate to Azure Logic Apps from Microsoft Flow"},{"content":"**For the past couple of months I\u0026rsquo;ve been using Microsoft Teams as my daily driver. This means that I\u0026rsquo;ve fully migrated from Skype for Business to Microsoft Teams.\nOne of the things I\u0026rsquo;m missing is notifications without the content of the message. 
(also called message preview) Especially when I\u0026rsquo;m presenting my screen during a meeting, I don\u0026rsquo;t want the meeting participants to read my Teams messages.\nDisabling notifications and relying on the taskbar icon didn\u0026rsquo;t work for me, as I\u0026rsquo;ve missed several messages per week.\nIn this post, I\u0026rsquo;ll show you how I used Python to create a toast message when I get a new message from Teams.**\nPython?! Why not use PowerShell?! You can definitely do this with PowerShell. I think it would\u0026rsquo;ve saved me some time to create it in PowerShell. As I\u0026rsquo;m on a journey to learn new languages, I like to keep challenging myself to do things in other languages.\nHow the Python script works As I couldn\u0026rsquo;t find a Teams client API that I could use with Python, I\u0026rsquo;m using the log files of the Teams client to discover the state of the client.\nIt\u0026rsquo;s not something I would recommend for a mass roll-out, but it does the job for me until Microsoft fixes this. (Vote for this UserVoice please!) Also please be aware that this is my second Python script and it might require improvement. :) It loops through the logs.txt file and searches for the \u0026ldquo;Available -\u0026gt; NewActivity\u0026rdquo; string.\nIf it finds that string, it means that the Teams client is receiving a message. (E.g. a private message or a message in a channel).\nThe next action that the script takes is to create a toast message by using the ToastNotifier module.\nScript contents \u0026amp; usage View the repository of the script on GitHub and feel free to contribute by opening a Pull Request!\n","permalink":"https://devsecninja.com/2019/03/15/microsoft-teams-notifications-without-message-preview-in-python/","summary":"\u003cp\u003e**For the past couple of months I\u0026rsquo;ve been using Microsoft Teams as my daily driver. 
This means that I\u0026rsquo;ve fully migrated from Skype for Business to Microsoft Teams.\u003c/p\u003e\n\u003cp\u003eOne of the things I\u0026rsquo;m missing is notifications without the content of the message. (also called message preview) Especially when I\u0026rsquo;m presenting my screen during a meeting, I don\u0026rsquo;t want the meeting participants to read my Teams messages.\u003c/p\u003e\n\u003cp\u003eDisabling notifications and relying on the taskbar icon didn\u0026rsquo;t work for me, as I\u0026rsquo;ve missed several messages per week.\u003c/p\u003e","title":"Microsoft Teams notifications without message preview in Python"},{"content":"This morning I passed both the TOGAF 9 Part 1 \u0026amp; Part 2 to become TOGAF Certified! According to TOGAF, this \u0026ldquo;is to provide validation that, in addition to the knowledge and comprehension of TOGAF 9 Foundation, the Candidate is able to analyze and apply this knowledge\u0026rdquo;. As I\u0026rsquo;ve put a large amount of effort and time into this, I\u0026rsquo;m really happy with this result! Read this post if you want to learn more about TOGAF and taking the exam.\nWhat is TOGAF and how do I use it? The TOGAF Standard is an open Enterprise Architecture framework from The Open Group. It is used by leading organizations to improve business efficiency. As a Solutions Architect it allows me to better align with the Enterprise Architecture of the organization. Especially when the organization has adopted TOGAF as their architecture framework, since that allows the various architects within the organization to use a common communication language.\nHow to become TOGAF 9 Certified? 
Becoming TOGAF 9 Certified requires you to pass both the TOGAF 9 Foundation (Part 1) and TOGAF 9 Certified (Part 2) exams.\nYou can also combine both exams, which means you take them in one sitting.\nYou have 60 minutes to complete Part 1 of the exam and 90 minutes to complete Part 2.\nIf you live in one of the ESL Countries specified on the TOGAF site, you get an additional 30 minutes for Part 1 and 30 minutes for Part 2.\nThe TOGAF 9 Certification Pyramid. Source\nPrepare for the TOGAF 9 Exams Within Avanade Netherlands, 3 TOGAF Certified architects gave training to a group of participants from various departments.\nThis is a great way to understand where TOGAF fits into my daily work and in which phases you connect with colleagues from other departments. I highly recommend this way of learning for TOGAF because the training material is very \u0026lsquo;static\u0026rsquo; and hard to understand if you cannot make the link with your daily work. (A big thanks to our great trainers!) Besides this training, I was on a project which required a 3-hour drive per day.\nThis allowed me to listen to the TOGAF Part 1 \u0026amp; Part 2 courses from Scott Duffy.\nWhile this course is not enough to pass the exam (as Scott also states during the course), it gives you a nice overview and helps you get familiar with all the phases in the ADM and the different terms used across TOGAF.\nBesides that, it\u0026rsquo;s not too expensive. 
I paid around 20 EUR for both courses combined.\nLast but not least, you need to spend a lot of time with the TOGAF 9 Study Guides.\nMake sure you use the practice tests in the book after each chapter.\nFlag chapters where you made too many mistakes or that you\u0026rsquo;re not familiar with.\nGo through the book again and make notes.\nTake the practice tests after each chapter again.\nFeeling confident about the content?\nMake sure you take sufficient time between the tests to make sure you\u0026rsquo;re not remembering the question and answer combinations.\nIf you are confident, take the practice tests from Appendix B. I scored 32 and 33 points for both practice tests, which told me that I should be ready to take the exam.\nAccording to the book, the score needed to pass is higher in the practice tests than in the actual exam.\nTo me, the actual exam was more difficult than the practice exams in the book.\nRepeat these steps for the Part 2 exam.\nWhile not every chapter contains a practice exam, you can read through the book and try the practice exams in the Appendix. I scored 33 points, where 28 points are needed to pass the exam.\nGo for the Part 1 \u0026amp; Part 2 combined? Yes, for sure. I spent most of my time on Part 1. Part 2 is an open book exam. During the exam, you get access to the complete TOGAF Standard document, which contains over 900 pages. As long as you understand Part 1 and you are able to carefully read the cases, you should be able to pass this one as well.\nTake the TOGAF 9 exams Taking the combined Part 1 and Part 2 exam requires some more effort. You need to stay focused for 3 - 4 hours, which can be difficult in a noisy exam center. (I still remember the noise from the AC!) 
Also be aware that there is no break between the two exams and you will not receive your score after taking the Part 1 exam.\nAfter the last question of Part 1, Part 2 automatically starts.\nTake the exam at a time you are most comfortable with.\nTo me, that is early in the morning. (I was the first one in the exam center today, so I could find the spot I like)\nPart 2 Exam Tip For every question, note down \u0026ldquo;1A, 1B, 1C, 1D\u0026rdquo; on your paper and look for the most unlikely answer.\nThis answer is worth 0 points.\nYou don\u0026rsquo;t want to select that one as it\u0026rsquo;s a distractor.\nAfter finding the distractor, look for the 2 most likely answers.\nCarefully study the answers and validate them in the book.\nIs the answer following the correct steps according to the TOGAF Standard?\nGo look into the TOGAF Standard document and use the search functionality.\nFinished the exam?\nThe exam software will not tell you whether you\u0026rsquo;ve passed.\nYou have to go back to the proctor, who will get your result from the printer.\nAfter you know your result, it can take up to 6 days before The Open Group confirms your exam via email.\nGood luck!\nLet me know in the comments if this was useful to you!\n","permalink":"https://devsecninja.com/2019/03/14/passed-the-togaf-9-certified-exams/","summary":"\u003cp\u003e\u003cstrong\u003eThis morning I passed both the TOGAF 9 Part 1 \u0026amp; Part 2 to become TOGAF Certified! According to TOGAF, this \u0026ldquo;is to provide validation that, in addition to the knowledge and comprehension of TOGAF 9 Foundation, the Candidate is able to analyze and apply this knowledge\u0026rdquo;. As I\u0026rsquo;ve put a large amount of effort and time into this, I\u0026rsquo;m really happy with this result! 
Read this post if you want to learn more about TOGAF and taking the exam.\u003c/strong\u003e\u003c/p\u003e","title":"Passed the TOGAF 9 Certified exams!"},{"content":"In this blog post, I will show you how to create an RSS Feed/WordPress Post Reminder with Azure Logic Apps, which you can use to post to various Social Media platforms like Twitter or Facebook. If you\u0026rsquo;ve been following me on Twitter, you\u0026rsquo;ve probably seen a post like this with a reminder to check out yesterday\u0026rsquo;s blog post: https://twitter.com/DevSecNinja/status/1084815812232388613?s=20 This all works automatically by using Azure and Azure Logic Apps.\nAzure Logic Apps is just pulling the RSS Feed of my blog, waiting for 24 hours and publishing a tweet on Twitter.\nIt\u0026rsquo;s really easy to set up as most blog sites and Content Management Systems (CMS) have a built-in RSS feed.\nLet me show you how to set this up.\nRequirements An Azure Subscription I\u0026rsquo;m using a Visual Studio Enterprise license, but a trial or Pay-As-You-Go Subscription will do just fine. Based on the cost history, this Azure Logic App construction seems to cost between € 0,00 and € 0,08. An RSS Feed from a site like WordPress To get the URL to your RSS feed, try yourdomain.com/feed or feed.xml. My WordPress.com blog is hosting an RSS feed on https://DevSecNinja.com/feed/. A Social Media account like Twitter You need to set up a connection between Azure Logic Apps and Twitter. Just follow the instructions in the blog post to get you started. Quick Start Tutorial Log in to the Azure Portal. Click on Create a resource and search the Marketplace for Logic App. Click on Logic App. A new blade will open. Click the Create button. Specify: Name - e.g. RSSFeedPostReminder or WordPressPostReminder Subscription - select your Azure subscription Resource Group - create a new one called \u0026ldquo;LogicApps\u0026rdquo; or use an existing one. Location - specify the region near you. E.g. 
West Europe Keep the Log Analytics switch set to Off Click Create. The Logic App will be provisioned for you, which can take a couple of minutes. Click on the bell icon to see the status of the deployment in the Notifications section: When the Logic App has been provisioned successfully, click on it. This will open the Logic App. Click the Edit button. This will open the Logic Apps Designer. You can now add new steps to your Logic App. In my case, I have the following steps defined: \u0026ldquo;When a feed item is published\u0026rdquo; - When a new post is available on my blog \u0026ldquo;Delay\u0026rdquo; - Wait for 1 day \u0026ldquo;Post a tweet\u0026rdquo; - Post a tweet on Twitter and use the featured image of my blog post. This should look something like this: Be creative! Think about also posting to Facebook or Microsoft Teams. Or maybe you want to send your blog posts to newsletter subscribers. It\u0026rsquo;s all possible. Enhance your Azure Logic Apps skills? Check out the Azure Logic Apps website and documentation on Microsoft Docs. Thanks for reading my blog post. I hope that this post helped you create a new Logic App. Let me know if you have other cool steps to share! 
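Under the hood, the three designer steps map onto a Logic Apps workflow definition. The JSON below is a hand-written sketch for illustration, not an export from the portal: the connector operation paths (/OnNewFeed, /posttweet), query names and connection references are assumptions that the designer normally fills in for you.

```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "triggers": {
      "When_a_feed_item_is_published": {
        "type": "ApiConnection",
        "recurrence": { "frequency": "Minute", "interval": 30 },
        "inputs": {
          "host": { "connection": { "name": "@parameters('$connections')['rss']['connectionId']" } },
          "method": "get",
          "path": "/OnNewFeed",
          "queries": { "feedUrl": "https://devsecninja.com/feed/" }
        }
      }
    },
    "actions": {
      "Delay": {
        "type": "Wait",
        "runAfter": {},
        "inputs": { "interval": { "count": 1, "unit": "Day" } }
      },
      "Post_a_tweet": {
        "type": "ApiConnection",
        "runAfter": { "Delay": [ "Succeeded" ] },
        "inputs": {
          "host": { "connection": { "name": "@parameters('$connections')['twitter']['connectionId']" } },
          "method": "post",
          "path": "/posttweet",
          "queries": { "tweetText": "ICYMI: @{triggerBody()?['title']}" }
        }
      }
    }
  }
}
```

The Wait action with a one-day interval is the \u0026ldquo;Delay\u0026rdquo; step from the designer; swapping the Twitter action for a Facebook or Teams connector only changes the second ApiConnection action.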
","permalink":"https://devsecninja.com/2019/01/20/use-azure-logic-apps-and-rss-to-create-a-simple-post-reminder-on-social-media/","summary":"\u003cp\u003e\u003cstrong\u003eIn this blog post, I will show you how to create an RSS\u003c/strong\u003e \u003cstrong\u003eFeed/WordPress Post Reminder with Azure Logic Apps, which you can use to post to various Social Media platforms like Twitter or Facebook.\u003c/strong\u003e If you\u0026rsquo;ve been following me on Twitter, you\u0026rsquo;ve probably seen a post like this with a reminder to check out yesterday\u0026rsquo;s blog post: \u003ca href=\"https://twitter.com/DevSecNinja/status/1084815812232388613?s=20\"\u003ehttps://twitter.com/DevSecNinja/status/1084815812232388613?s=20\u003c/a\u003e This all works automatically by using Azure and Azure Logic Apps.\u003c/p\u003e","title":"Use Azure Logic Apps and RSS to Create a Simple Post Reminder on Social Media"},{"content":"This year already started great for me by passing the AZ-102: Microsoft Azure Administrator Certification Transition exam!\nThis transition exam is based on content from the AZ-100 and AZ-101 exams.\nIn this blog post, I will share some tips that really helped me pass the exam. https://twitter.com/DevSecNinja/status/1080433831667265541?s=20 As I\u0026rsquo;m working with Azure a lot during my work as an Azure Architect/Engineer, the platform is just too big to be able to master everything.\nMy first advice is to work through an online course just to see which content you understand and which content needs more attention. 
I can recommend this course on Udemy from Nick Colyer.\nWhat I like about his course is that Nick walks you through every topic and clearly explains what you need to know before he digs into the Azure Portal or CLI to show you how it\u0026rsquo;s done.\nAlso there are little to no intros and outros, which saves you time.\nMake sure you write notes during his courses and also note down when you need to pay some extra attention to a subject.\nAfter finishing his courses, I went back to my notes and made sure that I spent enough time in my Azure lab on the subjects that needed more attention.\nIf you don\u0026rsquo;t have an Azure lab, you can get a free trial with $200 credits, which is plenty as long as you shut down your VMs or remove the resources after your tests.\nAlways search for changes Now it\u0026rsquo;s time to identify the gaps between what you know and what you need to know to pass the exam. If you search Google for the following, you\u0026rsquo;ll see that Microsoft currently hasn\u0026rsquo;t updated the exam. That doesn\u0026rsquo;t mean it will not get updated after my blog post, so always do a quick search:\n\u0026ldquo;AZ-102\u0026rdquo; \u0026ldquo;changes\u0026rdquo; site:*microsoft.com filetype:pdf\nThis will search for \u0026ldquo;AZ-102\u0026rdquo; and \u0026ldquo;changes\u0026rdquo; on sites ending with microsoft.com with a filetype of PDF, as Microsoft used to publish this content as PDFs. If you cannot find a PDF from the Microsoft site, you can just follow the objectives from the Microsoft website.\nMy 2 cents During my exam I found out that I underestimated some subjects. 
Make sure you know how Azure Migrate works and, if you cannot play around with it, make sure you know its limitations, how it works and what is supported.\nAlso make sure you know how to set up SSPR and how to configure various Conditional Access Policies.\nLast but not least, know how to use Azure Site Recovery and when to use Logic Apps, Azure Functions, Event Grid and Service Bus.\nEspecially setting up Service Bus was something that I should\u0026rsquo;ve spent more time on.\nIf you are planning to take the exam, please let me know how it went and if this blog post helped you pass the exam.\nThank you!\n","permalink":"https://devsecninja.com/2019/01/13/passed-microsoft-az-102-azure-administrator-cert-transition-exam/","summary":"\u003cp\u003eThis year already started great for me by passing the \u003ca href=\"https://www.microsoft.com/en-us/learning/exam-az-102.aspx\"\u003eAZ-102: Microsoft Azure Administrator Certification Transition exam\u003c/a\u003e!\u003c/p\u003e\n\u003cp\u003eThis transition exam is based on content from the AZ-100 and AZ-101 exams.\u003c/p\u003e\n\u003cp\u003eIn this blog post, I will share some tips that really helped me pass the exam. \u003ca href=\"https://twitter.com/DevSecNinja/status/1080433831667265541?s=20\"\u003ehttps://twitter.com/DevSecNinja/status/1080433831667265541?s=20\u003c/a\u003e As I\u0026rsquo;m working with Azure a lot during my work as an Azure Architect/Engineer, the platform is just too big to be able to master everything.\u003c/p\u003e","title":"Passed Microsoft AZ-102: Azure Administrator Cert Transition exam"},{"content":"The Dutch Government is aiming to provide smart meters to every household before Q4 2020. All the smart meters need to comply with DSMR (Dutch Smart Meter Requirements). DSMR allows us to read data from the smart meter by using a cable. 
In this guide, I will explain how I got this to work with Home Assistant.\nPrerequisites Home Assistant (Hass.io) running on a Raspberry Pi One of the supported smart meters. I\u0026rsquo;m using the Landis+Gyr ZMF110 with DSMR 4.2 from Liander A cable to connect from USB on your Pi to the smart meter (Optionally) a second Raspberry Pi running on a Linux distro to send the data over the network to your Hass.io Pi Connect your Raspberry Pi to the smart meter Insert the USB cable in the Raspberry Pi and the other side of the cable into your smart meter. A data connection should now be established.\nOptional: Use Ser2Net on another Raspberry Pi if your Pi isn\u0026rsquo;t close to the smart meter I have two Pis and only one of them is close to the smart meter, while the other Pi hosts the Hass.io image. Luckily there is a great Linux package called Ser2Net. This tool makes the serial connection available over the network. Install and configure it with the following commands:\nsudo apt-get update sudo apt-get install ser2net sudo nano /etc/ser2net.conf sudo service ser2net restart At step 3, make sure you grab the example code from the official Home Assistant component page and save the file. Please note that I assume that the USB cable is connected to /dev/ttyUSB0. If that\u0026rsquo;s not the case, change this value to the correct port.\nEnable the DSMR component in Home Assistant Open your Configuration.yaml file or your Sensors.yaml file and specify the following configuration if the Smart Meter is directly connected to USB:\n# Example configuration.yaml entry for USB/serial connected Smartmeter sensor: - platform: dsmr port: /dev/ttyUSB0 dsmr_version: 5 If you are using Ser2Net on another Pi, use the following configuration:\nDon\u0026rsquo;t forget to check the DSMR version on your smart meter and the USB port on the Raspberry Pi. Now all the sensors will be available in Home Assistant. 
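If you use Ser2Net as described above, the network variant of the DSMR configuration should look roughly like this sketch, based on the Home Assistant DSMR component documentation (the host IP and port are placeholder assumptions for your own network):

```yaml
# Example configuration.yaml entry for a smart meter exposed over the
# network via ser2net on another Raspberry Pi.
# 192.168.1.10 and 2001 are placeholders; dsmr_version 4 matches the
# Landis+Gyr DSMR 4.2 meter mentioned above, so check your own meter.
sensor:
  - platform: dsmr
    host: 192.168.1.10
    port: 2001
    dsmr_version: 4
```

If you later lock this port down with ufw as suggested in Update 2, a rule along the lines of sudo ufw allow from 192.168.1.20 to any port 2001 proto tcp (with the IP address of your Home Assistant Pi as a placeholder) keeps the meter data off the rest of the network.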
You might want to structure this as a group in the configuration.yaml file or, like me, in a separate groups.yaml file:\nmeter_readings: name: Meter readings entities: - sensor.power_consumption - sensor.power_consumption_low - sensor.power_consumption_normal - sensor.power_consumption_phase_l1 - sensor.power_consumption_phase_l2 - sensor.power_consumption_phase_l3 - sensor.power_production - sensor.power_production_low - sensor.power_production_normal - sensor.power_production_phase_l1 - sensor.power_production_phase_l2 - sensor.power_production_phase_l3 - sensor.power_tariff - sensor.voltage_sags_phase_l1 - sensor.voltage_sags_phase_l2 - sensor.voltage_sags_phase_l3 - sensor.voltage_swells_phase_l1 - sensor.voltage_swells_phase_l2 - sensor.voltage_swells_phase_l3 - sensor.long_power_failure_count - sensor.gas_consumption - sensor.hourly_gas_consumption Update 1: I\u0026rsquo;ve been playing around with my dashboard in Home Assistant this weekend. I now have the following view in my dashboard: Screenshot of Meter Readings view in Home Assistant\nUpdate 2: Something I forgot to mention: if you use the ser2net package mentioned above, make sure you allow access to the port (e.g. 2001) from Home Assistant and deny access to any other device. This can be done by using a firewall like ufw on Linux. Otherwise your smart meter data is visible to anyone on the (local) network. Big thanks to everyone who made this possible in the Home Assistant Community! Cheers.\n","permalink":"https://devsecninja.com/2018/08/04/use-smart-energy-meter-with-home-assistant/","summary":"\u003cp\u003e\u003cstrong\u003eThe Dutch Government is aiming to provide smart meters to every household before Q4 2020. All the smart meters need to comply with DSMR (Dutch Smart Meter Requirements). DSMR allows us to read data from the smart meter by using a cable. 
In this guide, I will explain how I got this to work with Home Assistant.\u003c/strong\u003e\u003c/p\u003e\n\u003ch2 id=\"prerequisites\"\u003ePrerequisites\u003c/h2\u003e\n\u003cul\u003e\n\u003cli\u003eHome Assistant (Hass.io) running on a Raspberry Pi\u003c/li\u003e\n\u003cli\u003eOne of the \u003ca href=\"https://www.home-assistant.io/components/sensor.dsmr/\"\u003esupported smart meters\u003c/a\u003e. I\u0026rsquo;m using the Landis+Gyr ZMF110 with DSMR 4.2 from Liander\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://www.sossolutions.nl/slimme-meter-kabel\"\u003eA cable\u003c/a\u003e to connect from USB on your Pi to the smart meter\u003c/li\u003e\n\u003cli\u003e(Optionally) a second Raspberry Pi running on a Linux distro to send the data over the network to your Hass.io Pi\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"connect-your-raspberry-pi-to-the-smart-meter\"\u003eConnect your Raspberry Pi to the smart meter\u003c/h2\u003e\n\u003cp\u003eInsert the USB cable in the Raspberry Pi and the other side of the cable into your smart meter. A data connection should now be established.\u003c/p\u003e","title":"Use Smart Energy Meter with Home Assistant"},{"content":"My last blog post was all about getting Hass.io (or HassIO) installed on the new Raspberry Pi 3 Model B+.\nThis guide starts right where we left off: configuring Home Assistant to work with the configuration files we already have from Home Assistant running on Raspbian.\nBelow are the steps I took in a nutshell.\nInstall and open the Configurator Add-on on Hass.io to make sure you can always open the web UI to change your configurations. Create a snapshot so you can always go back to this point in time. Cut/paste the BaseURL and SSL settings from the configuration.yaml on your old Pi to the new configuration.yaml on your new Pi by using the Configurator add-on. 
Make sure that you have an SSH session open to the old Pi on the IP address of the old Pi, so you can still copy/paste the contents of various configurations. Also stop the Home Assistant service on your old Raspberry Pi and change any port forwarding rules in your firewall or DNS settings. (Depending on your old setup) Got your Home Assistant ready under the original URL? Create a new snapshot, just to be sure. Start copying the contents of your configuration.yaml and other relevant YAML configurations by grabbing it from SSH and pasting it in the Configurator Add-on. I was surprised to see that all the modules I\u0026rsquo;ve used before on Raspbian are working fine on Hass.io! So don\u0026rsquo;t worry too much about that. Go to your Hass.io URL and confirm the dashboard is back to where it was before. Let me know if this guide helped you out! Cheers!\n","permalink":"https://devsecninja.com/2018/08/04/configuring-the-new-home-assistant-hass.io-64-bit-image-on-a-raspberry-pi-3-model-b/","summary":"\u003cp\u003eMy \u003ca href=\"http://DevSecNinja.com/2018/07/20/installing-the-new-home-assistant-hass-io-64-bit-image-on-a-raspberry-pi-3-model-b/\"\u003elast blog post\u003c/a\u003e was all about getting Hass.io (or HassIO) installed on the new Raspberry Pi 3 Model B+.\u003c/p\u003e\n\u003cp\u003eThis guide starts \u003ca href=\"http://DevSecNinja.com/2018/07/20/installing-the-new-home-assistant-hass-io-64-bit-image-on-a-raspberry-pi-3-model-b/\"\u003eright where we left\u003c/a\u003e off: configuring Home Assistant to work with the configuration files we already have from Home Assistant running on Raspbian.\u003c/p\u003e\n\u003cp\u003eBelow are the steps I took in a nutshell.\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eInstall and open the Configurator Add-on on Hass.io to make sure you can always open the web UI to change your configurations.\u003c/li\u003e\n\u003cli\u003eCreate a snapshot so you can always go back to this point in 
time.\u003c/li\u003e\n\u003cli\u003eCut/paste the BaseURL and SSL settings from the configuration.yaml on your old Pi to the new configuration.yaml on your new Pi by using the Configurator add-on. Make sure that you have an SSH session open to the old Pi on the IP address of the old Pi, so you can still copy/paste the contents of various configurations. Also stop the Home Assistant service on your old Raspberry Pi and change any port forwarding rules in your firewall or DNS settings. (Depending on your old setup)\u003c/li\u003e\n\u003cli\u003eGot your Home Assistant ready under the original URL? Create a new snapshot, just to be sure.\u003c/li\u003e\n\u003cli\u003eStart copying the contents of your configuration.yaml and other relevant YAML configurations by grabbing it from SSH and pasting it in the Configurator Add-on. I was surprised to see that all the modules I\u0026rsquo;ve used before on Raspbian are working fine on Hass.io! So don\u0026rsquo;t worry too much about that.\u003c/li\u003e\n\u003cli\u003eGo to your Hass.io URL and confirm the dashboard is back to where it was before.\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eLet me know if this guide helped you out! 
Cheers!\u003c/p\u003e","title":"Configuring the new Home Assistant Hass.io 64-bit image on a Raspberry Pi 3 Model B+"},{"content":"On Twitter I asked the following to David James (Director of Engineering, ConfigMgr, Microsoft) and Johan Arwidmark (CTO @ TrueSec): https://twitter.com/DevSecNinja/status/1024927840138145793 For example, I have 3 device collections in SCCM that I call: \u0026ldquo;Windows 10 Feature Updates - Test\u0026rdquo; \u0026ldquo;Windows 10 Feature Updates - Pre-Production\u0026rdquo; \u0026ldquo;Windows 10 Feature Updates - Production\u0026rdquo; With ADRs, that\u0026rsquo;s quite simple.\nJust add the deployment to the Software Update Group in SCCM and you\u0026rsquo;re done.\nBut I was wondering if that is supported in the Servicing Plan scenario too, as with a Servicing Plan you define the amount of days it will take after a build release, before SCCM will deploy the feature update to the collection.\nOption 1: Multiple Servicing Plans, one Software Update Group (Automated) Rob York (Program Manager for Configuration Manager at Microsoft) joined the discussion and he pointed out that you can create multiple Servicing Plans and select an existing update package. Once the update package has been downloaded, it will not be re-downloaded. Proof is in the image below. https://twitter.com/robdotyork/status/1025049053699235840\nOption 2: One Servicing Plan, multiple deployments (Manual) Chris Roberts pointed out that you can also use the console to quickly deploy the Servicing Plan to a new collection. Using this method, you have more control over the timing when a feature update gets pushed to a new device collection. 
https://twitter.com/young_robbo/status/1025084944291385345?s=21\nThat escalated quickly\u0026hellip; If you open the first tweet of this blog post, you see we had some interesting discussions about when to use a Servicing Plan and a Task Sequence.\nJohan is not a big fan of the Servicing Plan model: https://twitter.com/jarwidmark/status/1025078067981688836 While David James explains what we will see with 1810: https://twitter.com/djammmer/status/1025137270934757376 Nick Wiley has a valid point here: https://twitter.com/npwiley/status/1025167790980980738?s=21 There are some more great tweets in there!\nConclusion In my opinion, Windows Servicing gives the best user experience, is easy and quick to set up, and is the next step towards Windows as a Service. If you need to perform various tasks before or directly after the upgrade, Task Sequences are still required. Let me know in the comments section what you prefer: Task Sequences or Windows Servicing. Cheers!\n","permalink":"https://devsecninja.com/2018/08/04/windows-servicing-plans-vs-task-sequences/","summary":"\u003cp\u003eOn Twitter I asked the following to David James (Director of Engineering, ConfigMgr, Microsoft) and Johan Arwidmark (CTO @ TrueSec): \u003ca href=\"https://twitter.com/DevSecNinja/status/1024927840138145793\"\u003ehttps://twitter.com/DevSecNinja/status/1024927840138145793\u003c/a\u003e For example, I have 3 device collections in SCCM that I call: \u0026ldquo;Windows 10 Feature Updates - Test\u0026rdquo; \u0026ldquo;Windows 10 Feature Updates - Pre-Production\u0026rdquo; \u0026ldquo;Windows 10 Feature Updates - Production\u0026rdquo; With ADRs, that\u0026rsquo;s quite simple.\u003c/p\u003e\n\u003cp\u003eJust add the deployment to the Software Update Group in SCCM and you\u0026rsquo;re done.\u003c/p\u003e\n\u003cp\u003eBut I was wondering if that is supported in the Servicing Plan scenario too, as with a Servicing Plan you define the number of days it will take after a build release before SCCM will 
deploy the feature update to the collection.\u003c/p\u003e","title":"Windows Servicing Plans vs Task Sequences"},{"content":"Because of the recent domain change of my blog, I decided to completely start over again with my lab in Azure. I\u0026rsquo;ve been working with Desired State Configuration (DSC) configurations in Azure for quite a few years, but never used them for my own lab environment.\nIt felt a bit overkill to do that, but now I wanted to start over and do everything right.\nThis step-by-step installation guide explains how to create a DSC Configuration in Azure Automation and how to apply it to your domain controller in Azure.\nPrerequisites Azure Automation Account Including these credentials: DomainAdminCredential SafeModePassword (Pick a random username) Including these variables: DomainName - Contains the domain name for Active Directory (e.g. mylab.DevSecNinja.com) Including these modules from the Modules Gallery: xActiveDirectory xStorage xPendingReboot An Azure Virtual Machine that will be the domain controller, with: A data disk Ensure you disable caching on both disks as that\u0026rsquo;s required for domain controllers running in Azure Installation Guide - Step-by-Step Configuring Azure DSC Ensure you\u0026rsquo;ve followed the prerequisite steps above, as the rest of the guide depends on them. Create the following folder structure on your machine: DSC DSC_AD_Domain Copy/Paste Script 1 - DSC_AD_Domain.ps1 from below and save it in the DSC_AD_Domain folder. Open the Azure Portal, open Automation Accounts and click on the Automation Account you\u0026rsquo;ve created based on the prerequisites. Open DSC Configurations, click the Add Configuration button and upload the DSC_AD_Domain.ps1 script. After importing the script, you should now see it in your DSC Configurations. Click on the script and click on the Compile button: Wait for a couple of minutes for the compilation to complete. 
Go back to the Azure Portal and ensure that the last compilation has completed successfully: Good job! You\u0026rsquo;ve compiled the DSC Configuration. We can now apply this configuration to a Virtual Machine in Azure. Apply the DSC Configuration to your Virtual Machine Browse and open your Automation Account in the Azure Portal. Click the DSC Nodes tab. Click the Add Azure VM button. Select the Virtual Machine you want to manage with DSC and click the Connect button: The Registration Blade will open. Make sure you select the new DSC Configuration you\u0026rsquo;ve recently uploaded. Also ensure you select \u0026ldquo;ApplyAndAutoCorrect\u0026rdquo; as the Configuration Mode. DSC will now take care of the configuration and ensure it will stay compliant.\nIt\u0026rsquo;s very important to select \u0026ldquo;Reboot Node if Needed\u0026rdquo;, otherwise the domain creation process cannot proceed as it will need a reboot there.\nBe careful with this setting in production though.\nClick the OK button so Azure can do its magic.\nGo back to the DSC Nodes tab and wait for the system to show up there. Also keep an eye on the status of the machine there. It\u0026rsquo;s a wrap! Thanks for reading this blog post. Let me know if you have any questions in the comments section. 
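The registration steps above can also be scripted. The snippet below is only a sketch: it uses the current Az.Automation PowerShell module (this post predates it; at the time, the AzureRM equivalent was Register-AzureRmAutomationDscNode), and the resource group, account and VM names are placeholder assumptions:

```powershell
# Register the Azure VM with Azure Automation State Configuration and
# apply the compiled node configuration (placeholder names, adjust to your lab)
Register-AzAutomationDscNode `
    -ResourceGroupName 'MyLab' `
    -AutomationAccountName 'MyAutomationAccount' `
    -AzureVMName 'DC01' `
    -NodeConfigurationName 'DSC_AD_Domain.localhost' `
    -ConfigurationMode 'ApplyAndAutoCorrect' `
    -RebootNodeIfNeeded $true
```

The node configuration name follows the ConfigurationName.NodeName pattern, so the compiled DSC_AD_Domain configuration for the localhost node becomes DSC_AD_Domain.localhost.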
Cheers!\nScript 1 - DSC_AD_Domain.ps1 This script was originally written by Michael Green and modified by me.\nconfiguration DSC_AD_Domain { # Import the modules needed to run the DSC script Import-DscResource -ModuleName \u0026#39;xActiveDirectory\u0026#39; Import-DscResource -ModuleName \u0026#39;xStorage\u0026#39; Import-DscResource -ModuleName \u0026#39;xPendingReboot\u0026#39; Import-DscResource -ModuleName \u0026#39;PSDesiredStateConfiguration\u0026#39; # When using with Azure Automation, modify these values to match your stored credential names $DomainAdminCredential = Get-AutomationPSCredential -Name \u0026#39;DomainAdminCredential\u0026#39; $SafeModePassword = Get-AutomationPSCredential -Name \u0026#39;SafeModePassword\u0026#39; $DomainName = Get-AutomationVariable -Name \u0026#39;DomainName\u0026#39; # Configuration node localhost { WindowsFeature ADDSInstall { Ensure = \u0026#39;Present\u0026#39; Name = \u0026#39;AD-Domain-Services\u0026#39; } xWaitforDisk Disk2 { DiskId = 2 RetryIntervalSec = 10 RetryCount = 30 } xDisk DiskF { DiskId = 2 DriveLetter = \u0026#39;F\u0026#39; DependsOn = \u0026#39;[xWaitforDisk]Disk2\u0026#39; } xPendingReboot BeforeDC { Name = \u0026#39;BeforeDC\u0026#39; SkipCcmClientSDK = $true DependsOn = \u0026#39;[WindowsFeature]ADDSInstall\u0026#39;,\u0026#39;[xDisk]DiskF\u0026#39; } xADDomain Domain { DomainName = $DomainName DomainAdministratorCredential = $DomainAdminCredential SafemodeAdministratorPassword = $SafeModePassword DatabasePath = \u0026#39;F:\\NTDS\u0026#39; LogPath = \u0026#39;F:\\NTDS\u0026#39; SysvolPath = \u0026#39;F:\\SYSVOL\u0026#39; DependsOn = \u0026#39;[WindowsFeature]ADDSInstall\u0026#39;,\u0026#39;[xDisk]DiskF\u0026#39;,\u0026#39;[xPendingReboot]BeforeDC\u0026#39; } Registry DisableRDPNLA { Key = \u0026#39;HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Control\\Terminal Server\\WinStations\\RDP-Tcp\u0026#39; ValueName = \u0026#39;UserAuthentication\u0026#39; ValueData = 0 ValueType = 
\u0026#39;Dword\u0026#39; Ensure = \u0026#39;Present\u0026#39; DependsOn = \u0026#39;[xADDomain]Domain\u0026#39; } } } ","permalink":"https://devsecninja.com/2018/07/22/use-powershell-dsc-and-azure-automation-to-create-an-active-directory-domain/","summary":"\u003cp\u003eBecause of the recent \u003ca href=\"http://DevSecNinja.com/2018/07/19/welcome-to-cloudenius-com/\"\u003edomain change of my blog\u003c/a\u003e, I decided to completely start over again with my lab in Azure. I\u0026rsquo;ve been working with Desired State Configuration (DSC) configurations in Azure for quite a few years, but never used them for my own lab environment.\u003c/p\u003e\n\u003cp\u003eIt felt a bit overkill to do that, but now I wanted to start over and do everything right.\u003c/p\u003e\n\u003cp\u003eThis step-by-step installation guide explains how to create a DSC Configuration in Azure Automation and how to apply it to your domain controller in Azure.\u003c/p\u003e","title":"Use PowerShell DSC and Azure Automation to Create an Active Directory Domain"},{"content":"Home Assistant recently announced a brand new image of Hass.IO running on HassOS. I instantly ordered a new Raspberry Pi 3 Model B+ to replace my older Raspberry Pi Model B, which was running Raspbian and Home Assistant. The guide below helps you install your new Hass.IO instance!\nPrerequisites Raspberry Pi 3 Model B+ 32 GB Micro-SD card (I ordered a SanDisk Micro-SDHC 32GB Extreme U3 100MB/s) Ethernet cable (if not using WiFi) Micro USB power supply (2.1 A - very important! I\u0026rsquo;m using an iPad charger) A Windows 10 machine (This guide might work on a MacBook Pro or a Linux distro too) Installation Guide - Step-by-Step Download the 64-bit version of Hass.IO from Hass.IO. (File was named hassos_rpi3-64-1.7.img.gz) Download and install Etcher. I\u0026rsquo;ve used the Win32DiskImager tool before, but just installed Etcher to ensure that I\u0026rsquo;m following the right process. 
Open Etcher, open the image file you just downloaded and select the SD card. Hit the Flash button! Wait until it\u0026rsquo;s completed. It took me a couple of minutes. Optional - If you want to set up WiFi or set a static IP: On a USB stick, create the network/my-network file and follow the howto from HassOS. Safely remove the SD card (and optional USB stick) from the computer. Insert the SD card (and optional USB stick) into the Raspberry Pi and turn it on. Make sure you connect the LAN cable if you don\u0026rsquo;t use WiFi! On first boot, it downloads the latest version of Home Assistant, which takes about 20 minutes. This gives you enough time to remove Etcher from your PC if you don\u0026rsquo;t need it anymore. You can now check your router/DHCP server to find the IP address of the Raspberry Pi. You can also reach the Raspberry Pi at http://hassio.local:8123. In the next blog post, we will talk about configuring Home Assistant in Hass.io with the new Raspberry Pi 3 Model B+. Cheers!\n","permalink":"https://devsecninja.com/2018/07/20/installing-the-new-home-assistant-hass.io-64-bit-image-on-a-raspberry-pi-3-model-b/","summary":"\u003cp\u003eHome Assistant recently \u003ca href=\"https://www.home-assistant.io/blog/2018/07/11/hassio-images/\"\u003eannounced a brand new image of Hass.IO running on HassOS\u003c/a\u003e. I instantly ordered a new Raspberry Pi 3 Model B+ to replace my older Raspberry Pi Model B, which was running Raspbian and Home Assistant. The guide below helps you install your new Hass.IO instance!\u003c/p\u003e\n\u003ch2 id=\"prerequisites\"\u003ePrerequisites\u003c/h2\u003e\n\u003cul\u003e\n\u003cli\u003eRaspberry Pi 3 Model B+\u003c/li\u003e\n\u003cli\u003e32 GB Micro-SD card (I ordered a SanDisk Micro-SDHC 32GB Extreme U3 100MB/s)\u003c/li\u003e\n\u003cli\u003eEthernet cable (if not using WiFi)\u003c/li\u003e\n\u003cli\u003eMicro USB power supply \u003cstrong\u003e(2.1 A - very important! 
I\u0026rsquo;m using an iPad charger)\u003c/strong\u003e\u003c/li\u003e\n\u003cli\u003eA Windows 10 machine (This guide might work on a MacBook Pro or a Linux distro too)\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"installation-guide--step-by-step\"\u003eInstallation Guide - Step-by-Step\u003c/h2\u003e\n\u003col\u003e\n\u003cli\u003eDownload \u003ca href=\"https://www.home-assistant.io/hassio/installation/\"\u003ethe 64-bit version of Hass.IO from Hass.IO\u003c/a\u003e. (File was named hassos_rpi3-64-1.7.img.gz)\u003c/li\u003e\n\u003cli\u003eDownload and install \u003ca href=\"https://etcher.io/\"\u003eEtcher\u003c/a\u003e. I\u0026rsquo;ve used the Win32DiskImager tool before, but just installed Etcher to ensure that I\u0026rsquo;m following the right process.\u003c/li\u003e\n\u003cli\u003eOpen Etcher, open the image file you just downloaded and select the SD card.\u003c/li\u003e\n\u003cli\u003eHit the Flash button! \u003cimg alt=\"Shows the Etcher application with the image and SDHC card selected\" loading=\"lazy\" src=\"/images/2018/07/etcher-1.png\"\u003e\u003c/li\u003e\n\u003cli\u003eWait until it\u0026rsquo;s completed. It took me a couple of minutes.\u003c/li\u003e\n\u003cli\u003eOptional - If you want to setup WiFi or set a static IP: On a USB stick, create the \u003ccode\u003enetwork/my-network\u003c/code\u003e file and follow the \u003ca href=\"https://github.com/home-assistant/hassos/blob/dev/Documentation/network.md\"\u003ehowto from \u003c/a\u003e \u003ca href=\"https://github.com/home-assistant/hassos/blob/dev/Documentation/network.md\"\u003eHassOS\u003c/a\u003e.\u003c/li\u003e\n\u003cli\u003eSafely remove the SD card (and optional USB stick) from the computer.\u003c/li\u003e\n\u003cli\u003eInsert the SD card (and optional USB stick) into the Raspberry Pi and turn it on. Make sure you connect the LAN cable if you don\u0026rsquo;t use WiFi! 
On first boot, it downloads the latest version of Home Assistant, which takes about 20 minutes.\u003c/li\u003e\n\u003cli\u003eThis gives you enough time to remove Etcher from your PC if you don\u0026rsquo;t need it anymore.\u003c/li\u003e\n\u003cli\u003eYou can now check your router/DHCP server to find the IP address of the Raspberry Pi. You can also reach the Raspberry Pi at \u003ca href=\"http://hassio.local:8123/\"\u003ehttp://hassio.local:8123\u003c/a\u003e.\u003c/li\u003e\n\u003c/ol\u003e\n\u003cp\u003eIn the next blog post, we will talk about configuring Home Assistant in Hass.io with the new Raspberry Pi 3 Model B+. Cheers!\u003c/p\u003e","title":"Installing the new Home Assistant Hass.IO 64-bit image on a Raspberry Pi 3 Model B+"},{"content":"Today I\u0026rsquo;m happy to tell you that my blog is now on the Cloudenius.com domain!\nSwitching again? Less than a year ago, I changed the domain name of the blog from jvrtech.net to jvr.cloud. Unfortunately the domain name (jvr.cloud) has never ranked well on Google. Searching for jvr cloud shows the JVR Cloud iOS app, which hasn\u0026rsquo;t been updated for a while:\nSearching for jvr.cloud gave the following search results, which are better but still not good:\nWhat about the old domain name (jvr.cloud)? It\u0026rsquo;s a shame I have to give up this domain name for the blog, as I loved it. JVR.Cloud will link to this site for at least a year, so make sure you update any bookmarks. Cheers!\n","permalink":"https://devsecninja.com/2018/07/19/welcome-to-cloudenius.com/","summary":"\u003cp\u003eToday I\u0026rsquo;m happy to tell you that my blog is now on the Cloudenius.com domain!\u003c/p\u003e\n\u003ch2 id=\"switching-again\"\u003eSwitching again?\u003c/h2\u003e\n\u003cp\u003eLess than a year ago, I \u003ca href=\"http://jvr.cloud/2017/12/02/new-domain-name-welcome-to-jvr-cloud/\"\u003echanged the domain name of the blog from jvrtech.net to jvr.cloud\u003c/a\u003e. 
Unfortunately the domain name (jvr.cloud) has never been greatly accepted by Google. Searching for jvr cloud shows the JVR Cloud iOS app which hasn\u0026rsquo;t been updated for a while:\u003c/p\u003e\n\u003cp\u003e\u003cimg alt=\"chrome_2018-07-09_17-16-52.png\" loading=\"lazy\" src=\"/images/2018/07/chrome_2018-07-09_17-16-52.png\"\u003e Searching for jvr.cloud gave the following search results, which are better but still not good:\u003c/p\u003e","title":"Welcome to Cloudenius.com!"},{"content":"**With Windows 10 1607, Microsoft introduced Dual Scan functionality, which allows the computer to connect with Microsoft Updates besides using WSUS or SCCM. Steve Henry from Microsoft: \u0026ldquo;It is for the enterprise that wants WU to be its primary update source while Windows Server Update Services (WSUS) provides all other content.\u0026rdquo; I\u0026rsquo;ve seen various blog posts not covering all the steps I had to take to ensure Windows only looks to SCCM/WSUS.\nEspecially covering Windows 10 deployments with System Center - Configuration Manager.**\nStep 1: Answer File During the Out of the Box Experience (OOBE) of Windows 10, Windows will look for updates.\nAs Group Policies are not available at the OOBE stage, it will grab them from the Microsoft Update servers. To ensure this doesn\u0026rsquo;t happen, set the ProtectYourPC setting in your answer file to 3. The update message will still pop-up during OOBE, but it will not download the updates:\n3\nStep 2: Registry After the computer comes out of OOBE, the user is able to login to the system. 
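The ProtectYourPC value from step 1 is set in the OOBE section of the Microsoft-Windows-Shell-Setup component of the answer file. A minimal sketch of that fragment (the processorArchitecture, publicKeyToken and other component attributes are left out here and depend on your image):

```xml
<!-- unattend.xml fragment (oobeSystem pass) -->
<component name="Microsoft-Windows-Shell-Setup">
  <OOBE>
    <!-- 3 = turn off Express settings; OOBE will not download updates -->
    <ProtectYourPC>3</ProtectYourPC>
  </OOBE>
</component>
```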
Normally it can take around 10 minutes or longer before the SCCM Client receives its policies and populates the WSUS Server in the registry.\nIf a user opens Settings and clicks on \u0026ldquo;Updates \u0026amp; Security\u0026rdquo;, followed by \u0026ldquo;Check for Updates\u0026rdquo;, Windows will actually start looking for updates with Microsoft Updates.\nAt some organizations, a technician will finish OOBE for the user and log in for the first time to check for updates.\nIf the technician accidentally does that via the \u0026ldquo;Check for Updates\u0026rdquo; button instead of using the SCCM Client, the machine will get its updates from Microsoft.\nTherefore, ensure that you run the PowerShell code below during your Task Sequence.\nYou can keep the WUServer and WUStatusServer registry keys empty as the SCCM Client will set them automatically after the SCCM policy gets applied on the system.\n# Add required registry keys for Windows Updates\ntry {\n    $WURegistryKeyPath = \u0026#34;HKLM:\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\u0026#34;\n    # Add registry value UseWUServer - 1 to enable WSUS, 0 for Microsoft Update\n    Write-Output \u0026#34;Adding HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\\AU\\UseWUServer value as DWORD with data 1\u0026#34;\n    New-ItemProperty -Path (Join-Path -Path $WURegistryKeyPath -ChildPath \u0026#34;AU\u0026#34;) -Name UseWUServer -PropertyType DWORD -Value 1\n    # Add registry value WUServer - to allow the SCCM client to fill in the WSUS Server\n    Write-Output \u0026#34;Adding HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\\WUServer value as String with data \u0026#34;\n    New-ItemProperty -Path $WURegistryKeyPath -Name WUServer -PropertyType String -Value \u0026#34;\u0026#34;\n    # Add registry value WUStatusServer - to allow the SCCM client to fill in the WSUS Server\n    Write-Output \u0026#34;Adding HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\\WUStatusServer value as String with data \u0026#34;\n    New-ItemProperty -Path $WURegistryKeyPath -Name WUStatusServer -PropertyType String -Value \u0026#34;\u0026#34;\n} catch [System.Exception] {\n    Write-Output \u0026#34;An error occurred while configuring Windows Updates\u0026#34;\n}\nStep 3: Group Policies Set policy \u0026ldquo;Do not connect to any Windows Update Internet locations\u0026rdquo; to Enabled in your Group Policy. This will ensure that Windows will not use Windows Update Internet locations.\nSet policy \u0026ldquo;Do not allow update deferral policies to cause scans against Windows Update\u0026rdquo; to Enabled in your Group Policy. This policy is new and might require an update of your ADMX templates. If you don\u0026rsquo;t want to import the ADMX templates, you can also use the registry which you can find on this great Group Policy Search site.\nSet policy \u0026ldquo;Select when Feature Updates are received\u0026rdquo; and \u0026ldquo;Select when Quality Updates are received\u0026rdquo; to \u0026ldquo;Not configured\u0026rdquo; in your Group Policy. If you use the Microsoft Security Baseline or CIS Benchmarks, remove this policy from your baseline. Even setting this policy to disabled in your baseline will enable Dual Scan.\nSet policy \u0026ldquo;Do not include drivers with Windows Updates\u0026rdquo; to Enabled in your Group Policy. Occasionally Windows will include drivers in Windows Updates. If you want full control of your drivers, set this policy.\nStep 4: Validate! On a Windows 10 machine, open Settings and click on \u0026ldquo;Updates \u0026amp; Security\u0026rdquo;. Click on \u0026ldquo;View installed update history\u0026rdquo; and ensure this is empty after a fresh deployment. Browse back to the Updates \u0026amp; Security menu and check for updates. Again, ensure no updates are getting installed. I advise keeping an eye on the updates that get applied to your system, as I only discovered the findings in steps 1 and 2 after my initial tests.
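For reference, the Group Policy settings from step 3 map to registry values under the same WindowsUpdate policy key used in step 2. A .reg sketch of the intended end state (these are the standard Windows Update policy value names; verify them against your ADMX templates before relying on this):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate]
; "Do not connect to any Windows Update Internet locations" = Enabled
"DoNotConnectToWindowsUpdateInternetLocations"=dword:00000001
; "Do not allow update deferral policies to cause scans against Windows Update" = Enabled
"DisableDualScan"=dword:00000001
; "Do not include drivers with Windows Updates" = Enabled
"ExcludeWUDriversInQualityUpdate"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
; Use the WSUS server configured by the SCCM client (step 2)
"UseWUServer"=dword:00000001
```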
Open PowerShell and run the following command:\n(New-Object -ComObject \u0026#34;Microsoft.Update.ServiceManager\u0026#34;).Services | select Name, IsDefaultAUService Ensure that the IsDefaultAUService parameter for the \u0026ldquo;Windows Server Update Service\u0026rdquo; has the value of True. Open PowerShell and run the following command:\nGet-WindowsUpdateLog Examine the log to ensure Windows is not connecting with Microsoft Updates. Hope you find this useful. Cheers!\n","permalink":"https://devsecninja.com/2018/04/21/update-windows-10-with-sccm/wsus-only-by-defeating-dual-scan/","summary":"\u003cp\u003e**With Windows 10 1607, Microsoft introduced Dual Scan functionality, which allows the computer to connect with Microsoft Updates besides using WSUS or SCCM. \u003ca href=\"https://blogs.technet.microsoft.com/wsus/2017/05/05/demystifying-dual-scan/\"\u003eSteve Henry from Microsoft\u003c/a\u003e: \u0026ldquo;It is for the enterprise that wants WU to be its primary update source while Windows Server Update Services (WSUS) provides all other content.\u0026rdquo; I\u0026rsquo;ve seen various blog posts not covering all the steps I had to take to ensure Windows only looks to SCCM/WSUS.\u003c/p\u003e","title":"Update Windows 10 with SCCM/WSUS only by defeating Dual Scan"},{"content":"**This error message is related to Device Guard Code Integrity in Windows 10 and shows up in the Event Viewer under the Code Integrity folder.\nAs of writing this article, the error message is not described in online documentation of Microsoft.** In the current scenario, the built-in Windows 10 apps like the Calculator, Alarms \u0026amp; Clock or the Photos app will instantly crash after opening it.\nThis error message tells that the sysfer.dll is not trusted by Microsoft and therefore cannot interfere with the Alarms \u0026amp; Clock app.\nSysfer.dll is the driver that Symantec Endpoint Protection uses with their Application \u0026amp; Device Control feature to control which processes 
are allowed to run.\nCode Integrity determined that a process (\\Device\\HarddiskVolume3\\Program Files\\WindowsApps\\Microsoft.WindowsAlarms_10.1709.2621.1000_x64__8wekyb3d8bbwe\\Time.exe) attempted to load \\Device\\HarddiskVolume3\\Windows\\System32\\sysfer.dll that did not meet the Store signing level requirements.\nThere are a couple of workarounds to \u0026lsquo;solve\u0026rsquo; the issue:\nUse Device Guard Code Integrity: exclude the C:\\Program Files\\WindowsApps\\* location from the Application \u0026amp; Device Control feature. If that doesn\u0026rsquo;t work (which I\u0026rsquo;m currently looking into), remove the Application \u0026amp; Device Control feature from the Symantec installation. Use Symantec Application \u0026amp; Device Control: disable User Mode Code Integrity by removing the \u0026ldquo;Enabled:UMCI\u0026rdquo; part in the CI Policy. Even though Symantec states that they are fully supporting Code Integrity, I don\u0026rsquo;t see how that\u0026rsquo;s going to work out because Device Guard doesn\u0026rsquo;t trust the Symantec driver. I also tried to remove the \u0026ldquo;Required:Enforce Store Applications\u0026rdquo; option from the CI Policy which didn\u0026rsquo;t fix the issue.\nEven if you add the driver to your Code Integrity Policy which runs in Audit mode, it will not work.\nOtherwise it would show the following error message:\nSource: Microsoft Docs \u0026ldquo;.. or violated code integrity policy. However, due to code integrity auditing policy, the image was allowed to load.\u0026rdquo; Hope this explains the error message and the options you have. 
Cheers!\n","permalink":"https://devsecninja.com/2018/04/21/driver-did-not-meet-the-store-signing-level-requirements-windows-10-code-integrity/","summary":"\u003cp\u003e**This error message is related to Device Guard Code Integrity in Windows 10 and shows up in the Event Viewer under the Code Integrity folder.\u003c/p\u003e\n\u003cp\u003eAs of writing this article, the error message is not described in online documentation of Microsoft.** In the current scenario, the built-in Windows 10 apps like the Calculator, Alarms \u0026amp; Clock or the Photos app will instantly crash after opening it.\u003c/p\u003e\n\u003cp\u003eThis error message tells that the sysfer.dll is not trusted by Microsoft and therefore cannot interfere with the Alarms \u0026amp; Clock app.\u003c/p\u003e","title":"*driver* did not meet the Store signing level requirements - Windows 10 Code Integrity"},{"content":"**With the Windows 10 Creators Update, Microsoft introduced Windows AutoPilot. Windows AutoPilot is a service which allows users to enroll their device with the Intune/Azure AD tenant of the organization during the Out-of-the-Box (OOBE) experience of Windows 10.\nBy using Windows AutoPilot, organizations can dramatically decrease the time needed to configure a new device.\nDuring Microsoft Tech Summit 2018 in Amsterdam, Michael Niehaus announced some exciting new features which I will discuss in this blog post too.**\nDeployment Scenario First a very short introduction on the deployment scenario. With Windows AutoPilot, the following deployment scenario now becomes reality:\nAs a new employee, John joins the Contoso organization in 4 weeks. Contoso creates a user account for John and will also order a new Surface Book 2 with their preferred OEM, Microsoft. John will receive the Surface Book 2 notebook at home 1 week before his first day at Contoso. 
John will boot up the device, is prompted to log in with the Contoso organization and has access to all the necessary resources like applications and portals. This step enrolls John\u0026rsquo;s device with Azure AD and Intune so that the device is managed by the organization. This deployment scenario involves no IT personnel while the user can start using the device within 15 minutes!\nOEM Support When you buy a device from one of the supported OEMs, the device is automatically added to the Windows AutoPilot service and is ready for deployment. No manual registration needed! The following OEMs support Windows AutoPilot:\nMicrosoft (Surface) - Now\nLenovo (end of March)\nHewlett Packard - HP (first half of CY2018)\nDell (second half of CY2018)\nPanasonic (second half of CY2018)\nFujitsu (second half of CY2018)\nToshiba (second half of CY2018)\nConfiguration Windows AutoPilot can be configured from the Azure Portal if you follow these steps. One of the last steps is to create a Windows AutoPilot Deployment profile. Currently, the settings here are quite limited as you can see below:\nWindows AutoPilot Deployment Profile Settings in the Azure Portal\nNew features announced during Tech Summit I see a lot of potential in Windows AutoPilot, especially because Microsoft released some new details during the last Microsoft Tech Summit in Amsterdam. (Slide Deck not available yet - but I found his slide deck from Expert Live US on the web)\nHybrid Azure AD Join The slide below describes it all. In the near future, you can join devices to Active Directory as well, by using an Offline Domain Join connector and a VPN connection! In theory this means you can now manage those devices with Active Directory (GPOs etc.) and SCCM too!
This is a brilliant idea and will allow a lot of organizations to use Windows AutoPilot now while migrating to a modern workplace.\nHybrid Azure AD Join with Windows AutoPilot (Source)\nCaptive Portal support Some organizations have Guest WiFi with a captive portal where the user needs to agree to terms. This will be supported in the OOBE. I requested this from Microsoft years ago, so it\u0026rsquo;s great to see it will be there soon.\nPlug and Forget With Plug and Forget, Microsoft delivers a deployment solution for devices like scoring boards or information screens at airports. The technician can plug the device into ethernet/power and it automatically deploys itself through the Windows AutoPilot service. Cool stuff.\nFeature requests for Windows AutoPilot Pre-defined keyboard layout As OEMs are already capturing the information to join a device into the Windows AutoPilot service, I would like to see that OEMs also forward keyboard layout information to Microsoft so users don\u0026rsquo;t have to select a keyboard layout. In the Netherlands, most people are using the United States International instead of the Dutch keyboard layout. So if the OEM already knows which keyboard is on the device, why not pass that along to Microsoft?\nWindows 10 Keyboard Layout selection in OOBE (Source)\nInstall applications or needed binaries at the factory In some scenarios, organizations are limited in network bandwidth. Shipping a device with the binaries of the applications already on the disk would increase both installation speed and adoption at low-bandwidth locations.\nCustom options during or at the end of OOBE Think about showing a (web)page to a user with content on how a user can enroll with certain services or where they can find their applications. Or get rid of the install and configuration documentation by showing a manual from the organization. I\u0026rsquo;m really excited to see the new features for Windows AutoPilot! Let me know what you think in the comments!
Cheers, Jean-Paul\n","permalink":"https://devsecninja.com/2018/03/31/windows-autopilot-new-features/","summary":"\u003cp\u003e**With the Windows 10 Creators Update, Microsoft introduced Windows AutoPilot. Windows AutoPilot is a service which allows users to enroll their device with the Intune/Azure AD tenant of the organization during the Out-of-the-Box (OOBE) experience of Windows 10.\u003c/p\u003e\n\u003cp\u003eBy using Windows AutoPilot, organizations can dramatically decrease the time needed to configure a new device.\u003c/p\u003e\n\u003cp\u003eDuring Microsoft Tech Summit 2018 in Amsterdam, Michael Niehaus announced some exciting new features which I will discuss in this blog post too.**\u003c/p\u003e","title":"Windows AutoPilot - New Features"},{"content":"Recently my Xbox Achievements stopped popping up for Forza Motorsport 7 as described before in this post.\nAchievements for other games were working fine, but after a while Battlefield 1 achievements were not popping up anymore either.\nAfter quite some troubleshooting, I found out that the Pi-Hole Ad-Block DNS service installed on my Raspberry Pi was blocking the achievements traffic to the Xbox Servers.\nThe strange thing is that Microsoft is using the same server addresses for telemetry data!\nSo blocking telemetry data also blocks achievements, while all other Microsoft services are working fine.\nEven the Network Tester on the Xbox told me that the connection was working fine.\nAfter I changed the DNS server settings on the Xbox to the Google DNS Servers (8.8.8.8 and 8.8.4.4), the achievements started to pop up slowly.\nOne achievement in Forza triggered all other achievements as well, resulting in 9 achievement unlocks at the same time.\nYou can follow this guide on the Microsoft Support site to change the DNS Settings on your Xbox One X: Network settings on Xbox One.\nLet me know if this solved your issues
too.\nCheers!\n","permalink":"https://devsecninja.com/2018/03/03/xbox-one-achievements-not-working/","summary":"\u003cp\u003eRecently my Xbox Achievements stopped popping up for Forza Motorsport 7 \u003ca href=\"http://cloudenius.com/2017/12/31/xbox-one-x-switching-from-ps4-pro/\"\u003eas described before in this post\u003c/a\u003e.\u003c/p\u003e\n\u003cp\u003eAchievements for other games were working fine, but after a while Battlefield 1 achievements were not popping up anymore either.\u003c/p\u003e\n\u003cp\u003eAfter quite some troubleshooting, I found out that the Pi-Hole Ad-Block DNS service installed on my Raspberry Pi was blocking the achievements traffic to the Xbox Servers.\u003c/p\u003e\n\u003cp\u003eThe strange thing is that Microsoft is using the same server addresses for telemetry data!\u003c/p\u003e","title":"Xbox One Achievements Not Working"},{"content":"Happy New Year all! A new workplace is a great way to start the new year. As some of you may know, I\u0026rsquo;ve moved to a new apartment in 2016. My new apartment has a quite large living room but no extra room for a workplace. I work full time at Avanade and I like to work from home every now and then. I\u0026rsquo;m also aiming to finish my Bachelor studies in Q1 2019, so all this requires a good workplace.\nRequirements The workplace needs to fit into my modern living room style. The workplace needs to be comfortable and adjustable. The workplace needs to have enough screen real estate and enough power outlets. Based on these requirements I\u0026rsquo;ve started to look for a Standing Desk.
This gives me the flexibility to stand and to stretch my legs.\nStudies tell me that sitting is the new smoking, so I wanted to invest a bit more in a Standing Desk.\nIn my search I\u0026rsquo;ve found some local dealers selling \u0026lsquo;basic\u0026rsquo; Standing Desks with a white table top and a metallic frame.\nThis is fine in a workplace, but it doesn\u0026rsquo;t fit in my living room.\nMost non-standard Standing Desks are only available in the US or Canada.\nThe only standing desk that was reviewed by major review sites was available from IKEA.\nBut the IKEA desk is quite standard and doesn\u0026rsquo;t seem very reliable.\nJarvis Standing Desk A couple of months later I found out that Jaswig in Belgium started shipping the Jarvis Standing Desks with bamboo tops.\nThis was exactly what I wanted! A modern frame which is rock solid, a good looking table top and the number 1 pick by The Wirecutter. I\u0026rsquo;ve started a conversation with Henri from Jaswig and a couple of days later, the Standing Desk arrived.\nIt was well packed and took me a couple of hours to install.\nThe instruction manual was understandable. I was amazed by the quality of all the parts - especially the bamboo table top.\nAt first I was a bit skeptical about the extra contour (the curve) that the table top has. I was thinking that a rectangle table top was better, but now I\u0026rsquo;m convinced that this is a great benefit over a rectangle top.\nLook at this!\nAccessories The Jarvis WireTamers are recommended. They are made of thick plastic and coated with the same color as the frame.\nThe downside is that there are no pre-drilled holes for the WireTamer, so I had to drill that hole in the table top myself with a screwdriver.
I placed a multi outlet in the WireTamer and drilled a hole in the wall so it will end up in my wall cabinet.\nThe Jarvis Tabletop comes with two standard grommet holes, which can be filled with something like the Bachmann ELEVATOR.\nThe Elevator is available with multiple inputs as you can see below. I bought the one with two power outlets as I didn\u0026rsquo;t want to have outlets that will be obsolete soon.\nThey are available on Amazon.\nJaswig also sells monitor arms which match the color of the Jarvis frame. They sell different versions of the monitor arm - even with a notebook stand. The stand works great and I\u0026rsquo;m happy with the built-in cable management.\nMonitor A good workplace contains a screen which is adjustable. This can be a notebook on a notebook stand or a nice monitor. I like to work with larger screens, so I wanted to have something which is 27+ inch.\nMy work notebook, a Dell Precision 5510, contains a beautiful 4K screen.\nBecause I wanted to have one large screen, I went with the AOC U3277FWQ which has an MVA panel and can be mounted on the Jarvis Monitor Arm.\nAfter checking the monitor for dead pixels, I tried to unscrew the monitor from its old stand. 1st screw done, 2nd screw was a bit stuck, 3rd screw done, 4th screw also stuck. AARGH! I used the correct screw head but unfortunately two of the screws were stuck.\nSo I contacted Coolblue - the store where I bought the monitor. I was hoping they could provide me with some help as I would imagine they would have experience with this.\nThey offered to swap the monitor for a new one!
I wasn\u0026rsquo;t expecting that, but I\u0026rsquo;m sure they know how to make customers smile.\nBecause I was convinced the screws were too tight due to the screw gun used by AOC, there was a big chance that this could happen with a device from the same batch.\nSo I asked Coolblue to swap the device for another model.\nAfter reading several reviews, I went with the LG 27UD69.\nThe price of this monitor was a bit higher but the quality of the IPS panel is fantastic!\nThe downside of using a beautiful notebook screen is that most MVA and VA panels - like the AOC - require some \u0026lsquo;adjustment time\u0026rsquo;. :) So after all I\u0026rsquo;m happy with this swap, despite the smaller screen size.\nThe LG screen is made of much more high quality materials.\nThe result My new workplace in 2018! Let me know if you like it in the comment section! ","permalink":"https://devsecninja.com/2018/01/02/my-new-modern-workplace-happy-new-year/","summary":"\u003cp\u003eHappy New Year all! A new workplace is a great way to start the new year. As some of you may know, I\u0026rsquo;ve moved to a new apartment in 2016. My new apartment has a quite large living room but no extra room for a workplace. I work full time at Avanade and I like to work from home every now and then. I\u0026rsquo;m also aiming on finishing my Bachelor studies in Q1 2019, so all this requires a good workplace.\u003c/p\u003e","title":"My New Modern Workplace - Happy New Year!"},{"content":"**As an Xbox 360 fanatic, I switched to PC gaming in 2011. After moving to a new house in 2016, there was no room for Triple Screen PC Gaming. 
I wanted a new and stable platform with 4K capabilities.\nThe PS4 Pro released with 4K support and I wanted to see what they had to offer.\nDuring that time, rumors were telling me that the Xbox Scorpio wasn\u0026rsquo;t going to release soon.\nWhen the Xbox One X - Scorpio Edition was available for pre-order a year later, I decided to switch.\nIn this blog post, I\u0026rsquo;ll explain why and what my current experiences are.**\nPlayStation 4 Pro No cloud or automatic backups I\u0026rsquo;m just a simple console gamer without the need for online gaming. On PlayStation, that means you\u0026rsquo;re losing a lot of features. Most importantly to me: games will not be backed up to the Cloud. So if your PlayStation hard disk crashes, you lose all your save games. Backing up a PlayStation 4 is a pain. It takes quite some time, requires reboots and you can only perform a manual backup!\nNo game downloads in standby As I don\u0026rsquo;t have a Fiber connection at home and internet over cable is quite expensive, my download and upload speed is limited. I have enough bandwidth to stream 4K Netflix which is fine, until you need to download 4K games. PlayStation can download the games in stand-by, but requires an online subscription too.\nXbox One X This is a big benefit for Microsoft, because they offer Cloud backups and game downloads in standby without a monthly subscription! I absolutely love the Xbox One X.\nFirst of all it\u0026rsquo;s really (really!) quiet and offers beautiful 4K footage. NBA 2K18 is one of my favorite games and the graphics of the courts and most players are incredible!\nAfter installing the mandatory updates, I\u0026rsquo;ve started to download the footage needed for NBA 2K18 and went out for some food.\nAfter I came back, the installation was finished and I started playing.\nIt was such a great experience!\nStream Xbox content with the Xbox App This is a killer feature from Microsoft! 
I use an Intel NUC DN2820FYKH with a slow Intel Celeron processor, 8 GB of RAM and Windows 10 as an HTPC in the bedroom. It\u0026rsquo;s fine for simple TV, NBA or Netflix streaming. I was surprised to see it is able to stream 1080P content from my Xbox One X in the living room! The Xbox Controller stays connected to the console in the living room next to the bedroom without any lag. I can even start the Xbox with the Xbox app and directly start playing games!\nController The controller that comes with the Xbox is a personal thing, but I\u0026rsquo;m happy with it as I like the lay-out of the thumb sticks and the feedback I get from the bumpers.\nBatteries can be easily replaced with new ones as I use rechargeable batteries.\nBut the quality of the controller is not good at all.\nThe left thumb stick on the Scorpio controller I received in the box was broken within a couple of days.\nIt felt like the internal part was damaged.\nIf you Google for \u0026ldquo;Xbox One controller left thumb stick cracking\u0026rdquo; you will find many more people facing this issue.\nLuckily Microsoft Support is excellent and they replaced my controller by sending me a new normal controller and swapping the Xbox One X - Scorpio Controller with a new Scorpio controller.\nUnfortunately, this controller also had this issue and the 3rd controller is now on its way.\nSome controllers also have some sharp edges which feel cheap.\nSome other things you should know HDMI-CEC support doesn\u0026rsquo;t work well for me.
When I started the PS4 Pro, my TV switched to the HDMI input.\nWith Xbox One X, this is limited to powering on my TV/receiver without switching to the HDMI input of the Xbox.\nLooks like I\u0026rsquo;m not the only one, as this HDMI-CEC Support Uservoice Request is backed by 4,000+ votes.\nAchievements are not working for me in Forza Motorsport 7.\nEven the first achievement, \u0026ldquo;Welcome to Forza\u0026rdquo;, which only requires you to complete your first race, is not popping up. I\u0026rsquo;ve completed and won several races already, but none of the achievements are popping up.\nMicrosoft Support asked me to fill in a form so they can take a look at it.\nLuckily achievements for other games are working fine. [UPDATE] This issue has been fixed. I\u0026rsquo;ve also seen games crashing or not starting up with error code \u0026ldquo;0x8027025a\u0026rdquo;.\nAfter restarting the game, it will work.\nAnd if you shut down the Xbox while you have a game like NBA 2K18 open, it will open up next time saying the connection was lost and you need to restart the game.\nIf you shut down the Xbox in F1 2017 while in the garage, all your telemetry is broken the next time you start the game.\nSo now I\u0026rsquo;m just closing all games before shutting down the Xbox.\nAll this doesn\u0026rsquo;t feel right if you buy a console to play games on a more reliable system than a PC.\nFinal notes After all, I\u0026rsquo;m still happy with the switch to the Xbox One X. The graphics are great, the system is much quieter and more powerful and I like the controller lay-out. I was really surprised to see that the Xbox app works so well on an old and slow Intel NUC. Let\u0026rsquo;s hope the Xbox platform will be more reliable soon.\n","permalink":"https://devsecninja.com/2017/12/31/xbox-one-x-switching-from-the-ps4-pro/","summary":"\u003cp\u003e**As an Xbox 360 fanatic, I switched to PC gaming in 2011. After moving to a new house in 2016, there was no room for Triple Screen PC Gaming.
I wanted a new and stable platform with 4K capabilities.\u003c/p\u003e\n\u003cp\u003eThe PS4 Pro released with 4K support and I wanted to see what they had to offer.\u003c/p\u003e\n\u003cp\u003eDuring that time, rumors were telling me that the Xbox Scorpio wasn\u0026rsquo;t going to release soon.\u003c/p\u003e","title":"Xbox One X - Switching from the PS4 Pro"},{"content":"With Windows Servicing, Microsoft is forcing consumers and businesses to upgrade to a Windows 10 Build twice a year.\nTheoretically you could go for one build per year, but that forces you to upgrade to a new build within 6 months.\nOtherwise you will end up without support for the old build.\nThis introduces quite some issues within both SMBs and large organizations.\nRecently a friend asked me about a recent printer that stopped working.\nThe printer was 2 months old and from a large vendor. I directly checked the build of the machine and yes, it was recently upgraded to the Fall Creators Update.\nThe printer was identified as an \u0026ldquo;Unknown USB Device\u0026rdquo;.\nUpdating the driver of the printer didn\u0026rsquo;t help.\nLuckily the Technical Support was responding quickly to help, but this means manual processing of orders for the next couple of weeks.\nYes I can revert the machine back to the old build, but will that fix the issue or create more issues?\nAnd because it\u0026rsquo;s not a Windows 10 Enterprise machine, Microsoft will try to update the machine later on.\nBuilds need to be more stable and aligned with 3rd Party vendors 3rd Party Vendors cannot keep up with the fast pace of releasing updates and drivers for Windows 10 Builds. The power of Windows is that it works with (nearly) everything. If those vendors cannot keep up with Windows, Microsoft needs to do something about it. Fix those release issues first before you set such high expectations for 3rd Party vendors. Going from a new build every 3+ years to twice a year is a big step. 
Therefore\u0026hellip;\nRelease builds yearly for enterprises (Large) enterprises are also struggling with Windows Servicing. For the same reasons as mentioned above: you cannot upgrade to a new build when 3rd Party vendors do not have their updates, drivers and software ready for the new build.\nMicrosoft wants organizations to have large test groups where new builds can first be tested.\nWhen builds are not stable, it will jeopardize the daily business and cost money when people are unable to do their job.\nOrganizations are also struggling to find people that have time, technical knowledge and most importantly want to be a tester.\nPick one, Microsoft, please. Don\u0026rsquo;t get me wrong here. I like that Microsoft is releasing smaller updates at a faster pace, but most organizations and vendors are not ready for this big change. Tell me your opinion on the new Windows Servicing model in the comments section below.\n","permalink":"https://devsecninja.com/2017/12/28/stable-windows-builds-or-yearly-releases/","summary":"\u003cp\u003eWith Windows Servicing, Microsoft is forcing consumers and businesses to upgrade to a Windows 10 Build twice a year.\u003c/p\u003e\n\u003cp\u003eTheoretically you could go for one build per year, but that forces you to upgrade to a new build within 6 months.\u003c/p\u003e\n\u003cp\u003eOtherwise you will end up without support for the old build.\u003c/p\u003e\n\u003cp\u003eThis introduces quite some issues within both SMBs and large organizations.\u003c/p\u003e\n\u003cp\u003eRecently a friend asked me about a recent printer that stopped working.\u003c/p\u003e","title":"Stable Windows Builds or Yearly Releases"},{"content":"Lost track of the service channel naming of Windows and Office Servicing? Is it \u0026ldquo;Current Branch\u0026rdquo; or \u0026ldquo;Semi-Annual Channel\u0026rdquo; now?!
Or Standard Release?!\nWindows 10: Ready: Semi-Annual Channel (Targeted) Ready for Business: Semi-Annual Channel\nOffice 365: Ready: Semi-Annual Channel (Targeted) (or Targeted Release) Ready for Business: Semi-Annual Channel (or Standard Release)\nLast update: recently\u0026hellip; :) Every day is a new day to change these again, so stay tuned!\n","permalink":"https://devsecninja.com/2017/12/18/latest-naming-for-windows/office-servicing-channels/","summary":"\u003cp\u003eLost track of the service channel naming of Windows and Office Servicing? Is it \u0026ldquo;Current Branch\u0026rdquo; or \u0026ldquo;Semi-Annual Channel\u0026rdquo; now?! Or Standard Release?!\u003c/p\u003e\n\u003cblockquote\u003e\n\u003ch3 id=\"windows-10\"\u003eWindows 10:\u003c/h3\u003e\n\u003cp\u003eReady: Semi-Annual Channel (Targeted) Ready for Business: Semi-Annual Channel\u003c/p\u003e\n\u003ch3 id=\"office-365\"\u003eOffice 365:\u003c/h3\u003e\n\u003cp\u003eReady: Semi-Annual Channel (Targeted) (or Targeted Release) Ready for Business: Semi-Annual Channel (or Standard Release)\u003c/p\u003e\n\u003c/blockquote\u003e\n\u003cp\u003e\u003cstrong\u003eLast update\u003c/strong\u003e: recently\u0026hellip; :) Every day is a new day to change these again, so stay tuned!\u003c/p\u003e","title":"Latest naming for Windows/Office Servicing channels"},{"content":"So you are signing your PowerShell scripts as a Best Practice from Microsoft. Good job! 
You\u0026rsquo;ve configured the PowerShell Execution Policy as AllSigned and you\u0026rsquo;ve created an application in SCCM where you run the signed script as:\nPowerShell.exe -File .\\Script.ps1\nThe application installs just fine on your machine from the Software Center. During the Task Sequence, however, the application cannot be installed. In the Event Viewer, you\u0026rsquo;ll find the following error message:\nPowerShell.exe: File cannot be loaded because running scripts is disabled on this system. For more information, see about_execution_policies at\u0026hellip;\u0026quot;\nYou open up PowerShell to see the current ExecutionPolicy. \u0026ldquo;Get-ExecutionPolicy -List\u0026rdquo; shows that all scopes have undefined execution policies. With \u0026ldquo;Get-Help about_Execution_Policies\u0026rdquo; you find out that an Undefined policy is equal to a Restricted policy, which \u0026ldquo;Permits individual commands, but will not run scripts\u0026rdquo;.\nThe solution Go back to your application in SCCM and make sure you set the ExecutionPolicy to AllSigned so it will work both during a Task Sequence and while working in the OS.\nPowerShell.exe -ExecutionPolicy AllSigned -File .\\Script.ps1\nCheers!\n","permalink":"https://devsecninja.com/2017/12/06/powershell-signed-scripts-cannot-be-loaded-because-running-scripts-is-disabled/","summary":"\u003cp\u003eSo you are signing your PowerShell scripts as a Best Practice from Microsoft. Good job! You\u0026rsquo;ve configured the PowerShell Execution Policy as AllSigned and you\u0026rsquo;ve created an application in SCCM where you run the signed script as:\u003c/p\u003e\n\u003cblockquote\u003e\n\u003cp\u003ePowerShell.exe -File .\\Script.ps1\u003c/p\u003e\n\u003c/blockquote\u003e\n\u003cp\u003eThe application installs just fine on your machine from the Software Center. During the Task Sequence, however, the application cannot be installed. 
You\u0026rsquo;ll find the following error message:\u003c/p\u003e","title":"PowerShell - Signed scripts \"cannot be loaded because running scripts is disabled\""},{"content":"Frequent visitors of my blog may have noticed that the domain name of the blog has changed from jvrtech.net to jvr.cloud. You can still reach my blog on jvrtech.net, but within a couple of years, that redirect may disappear.\n.Cloud TLD Most of my short nicknames or my full name aren\u0026rsquo;t available anymore on the TLDs like .com, .net or .org. When the .cloud TLD was introduced, I saw an opportunity to buy this very short domain name. Tech in a domain name tells people that it has something to do with technology which I like, but Cloud will hopefully do that too. Thank you for visiting my blog! Cheers, Jean-Paul\n","permalink":"https://devsecninja.com/2017/12/02/welcome-to-jvr.cloud/","summary":"\u003cp\u003eFrequent visitors of my blog may have noticed that the domain name of the blog has changed from jvrtech.net to jvr.cloud. You can still reach my blog on jvrtech.net, but within a couple of years, that redirect may disappear.\u003c/p\u003e\n\u003ch2 id=\"cloud-tld\"\u003e.Cloud TLD\u003c/h2\u003e\n\u003cp\u003eMost of my short nicknames or my full name aren\u0026rsquo;t available anymore on the TLDs like .com, .net or .org. When the .cloud TLD was introduced, I saw an opportunity to buy this very short domain name. Tech in a domain name tells people that it has something to do with technology which I like, but Cloud will hopefully do that too. Thank you for visiting my blog! Cheers, Jean-Paul\u003c/p\u003e","title":"Welcome to JVR.Cloud!"},{"content":"**This article describes the licensing options you have when you want to deploy Windows Server Virtual Machines in Azure. It\u0026rsquo;s getting complicated when you start using the Hybrid Use Benefit solution, so always contact Microsoft or your licensing supplier. 
**Please note that I will not answer any licensing questions.\nBuilt-in Licensing for Windows Server This type of licensing is by far the easiest to use, but it can be an expensive solution. You deploy an Azure Virtual Machine from the Portal or PowerShell and the licensing costs are automatically included with the Virtual Machine costs. But what if you want to use your existing KMS licenses which you\u0026rsquo;ve bought with your Enterprise Agreement? Or you want to use Windows Server Standard licenses instead of Datacenter licenses?\nHybrid Use Benefit The Hybrid Use Benefit is a \u0026lsquo;new\u0026rsquo; way of licensing. Microsoft will not activate your machine and you need to use your own activation method. That can be a KMS Server or a MAK-key. You also need to set the Hybrid Use Benefit flag from the Azure Portal or in your JSON template: \u0026ldquo;licenseType\u0026rdquo;: \u0026ldquo;Windows_Server\u0026rdquo;. If you forget to set this configuration, you have to recreate the Virtual Machine; as of today, there is no way to change this value afterwards.\nUsing your own MAK-key, KMS Server or AD based activation The best way to enforce the MAK-key or KMS Client key from the organization on a large scale is to use PowerShell DSC. PowerShell DSC will check every couple of minutes to see if the server is activated. If not, it will replace the key and try to reactivate. Use the script below as an example for your DSC Configuration.\nNeed Windows Server Standard? Most of the traditional Windows Server Standard licenses are bought with an Enterprise Agreement (EA) as a package deal with Microsoft. If you want to use those licenses in Azure, you have to create and maintain your own Windows Server image and upload it to Azure. 
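To illustrate the \u0026ldquo;licenseType\u0026rdquo; flag described in the Hybrid Use Benefit section above, here is a minimal sketch of the relevant part of a virtual machine resource in an ARM template (the parameter names and apiVersion are illustrative, and all other VM properties are omitted):

```json
{
  "type": "Microsoft.Compute/virtualMachines",
  "apiVersion": "2017-03-30",
  "name": "[parameters('vmName')]",
  "location": "[parameters('region')]",
  "properties": {
    "licenseType": "Windows_Server"
  }
}
```

Remember to set this flag at creation time, since it cannot be changed afterwards without recreating the Virtual Machine.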
As of today, you cannot grab a Windows Server Standard VM from the Azure Marketplace and deploy that.\nWhat Microsoft needs to improve: Please allow organizations to use Windows Server Standard images so there is no need to maintain their own custom images Allow organizations to convert their current Windows Server licenses to Azure licenses more easily Activate Windows Server Standard licenses with Azure KMS which removes the need to host a KMS server DSC Configuration for Windows Server Activation in Azure Script ActivateWindows { DependsOn = @(\u0026#34;[xDSCDomainjoin]JoinDomain\u0026#34;) SetScript = { # Get current OS version $OsVersion = (Get-WmiObject -class Win32_OperatingSystem).Caption # Add KMS client key for server version to variable $Key switch -Regex ($OsVersion) { \u0026#39;Windows Server 2012 R2 Standard\u0026#39; { $Key = \u0026#39;D2N9P-3P6X9-2R39C-7RTCD-MDVJX\u0026#39;; break } \u0026#39;Windows Server 2016 Standard\u0026#39; { $Key = \u0026#39;WC2BQ-8NRM3-FDDYY-2BFGV-KHKQY\u0026#39;; break } \u0026#39;Windows Server 2012 Server Standard\u0026#39; { $Key = \u0026#39;XC9B7-NBPP2-83J2H-RHMBY-92BT4\u0026#39;; break } \u0026#39;Windows Server 2008 R2 Standard\u0026#39; { $Key = \u0026#39;YC6KT-GKW9T-YTKYR-T4X34-R7VHC\u0026#39;; break } } # Gather KMS Service info $KmsService = Get-WMIObject -query \u0026#34;select * from SoftwareLicensingService\u0026#34; # Set KMS Server $null = $KmsService.SetKeyManagementServiceMachine(\u0026#39;YourLicenseServer.contoso.com\u0026#39;) # Install the KMS Client Key $null = $KmsService.InstallProductKey($Key) # Activate Windows $null = $KmsService.RefreshLicenseStatus() } TestScript = { function Get-ActivationStatus { [CmdletBinding()] param( [Parameter(ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)] [string]$DNSHostName = $Env:COMPUTERNAME ) process { try { $wpa = Get-WmiObject SoftwareLicensingProduct -ComputerName $DNSHostName ` -Filter \u0026#34;ApplicationID = 
\u0026#39;55c92734-d682-4d71-983e-d6ec3f16059f\u0026#39;\u0026#34; ` -Property LicenseStatus -ErrorAction Stop } catch { $status = New-Object ComponentModel.Win32Exception ($_.Exception.ErrorCode) $wpa = $null } $out = New-Object psobject -Property @{ ComputerName = $DNSHostName; Status = [string]::Empty; } if ($wpa) { :outer foreach ($item in $wpa) { switch ($item.LicenseStatus) { 0 { $out.Status = \u0026#34;Unlicensed\u0026#34; } 1 { $out.Status = \u0026#34;Licensed\u0026#34;; break outer } 2 { $out.Status = \u0026#34;Out-Of-Box Grace Period\u0026#34;; break outer } 3 { $out.Status = \u0026#34;Out-Of-Tolerance Grace Period\u0026#34;; break outer } 4 { $out.Status = \u0026#34;Non-Genuine Grace Period\u0026#34;; break outer } 5 { $out.Status = \u0026#34;Notification\u0026#34;; break outer } 6 { $out.Status = \u0026#34;Extended Grace\u0026#34;; break outer } default { $out.Status = \u0026#34;Unknown value\u0026#34; } } } } else { $out.Status = $status.Message } $out } } If ((Get-ActivationStatus).Status -eq \u0026#34;Licensed\u0026#34;) { return $True } Else { return $False } } GetScript = { function Get-ActivationStatus { [CmdletBinding()] param( [Parameter(ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)] [string]$DNSHostName = $Env:COMPUTERNAME ) process { try { $wpa = Get-WmiObject SoftwareLicensingProduct -ComputerName $DNSHostName ` -Filter \u0026#34;ApplicationID = \u0026#39;55c92734-d682-4d71-983e-d6ec3f16059f\u0026#39;\u0026#34; ` -Property LicenseStatus -ErrorAction Stop } catch { $status = New-Object ComponentModel.Win32Exception ($_.Exception.ErrorCode) $wpa = $null } $out = New-Object psobject -Property @{ ComputerName = $DNSHostName; Status = [string]::Empty; } if ($wpa) { :outer foreach ($item in $wpa) { switch ($item.LicenseStatus) { 0 { $out.Status = \u0026#34;Unlicensed\u0026#34; } 1 { $out.Status = \u0026#34;Licensed\u0026#34;; break outer } 2 { $out.Status = \u0026#34;Out-Of-Box Grace Period\u0026#34;; break outer } 3 { 
$out.Status = \u0026#34;Out-Of-Tolerance Grace Period\u0026#34;; break outer } 4 { $out.Status = \u0026#34;Non-Genuine Grace Period\u0026#34;; break outer } 5 { $out.Status = \u0026#34;Notification\u0026#34;; break outer } 6 { $out.Status = \u0026#34;Extended Grace\u0026#34;; break outer } default { $out.Status = \u0026#34;Unknown value\u0026#34; } } } } else { $out.Status = $status.Message } $out } } @{ Result = ((Get-ActivationStatus).Status) } } } ","permalink":"https://devsecninja.com/2017/12/02/azure-windows-server-licensing-explained/","summary":"\u003cp\u003e**This article describes the licensing options you have when you want to deploy Windows Server Virtual Machines in Azure. It\u0026rsquo;s getting complicated when you start using the Hybrid Use Benefit solution, so always contact Microsoft or your licensing supplier. **\u003cstrong\u003ePlease note that I will not answer any licensing questions.\u003c/strong\u003e\u003c/p\u003e\n\u003ch2 id=\"built-in-licensing-for-windows-server\"\u003eBuilt-in Licensing for Windows Server\u003c/h2\u003e\n\u003cp\u003eThis type of licensing is by-far the most easy to use but it can be an expensive solution. You deploy an Azure Virtual Machine from the Portal or PowerShell and the licensing costs are automatically included with the Virtual Machine costs. But what if you want to use your existing KMS licenses which you\u0026rsquo;ve bought with your Enterprise Agreement? Or you want to use Windows Server Standard licenses instead of Datacenter licenses?\u003c/p\u003e","title":"Azure - Windows Server Licensing Explained"},{"content":"**With the new Windows 10 Fall Creators Update, Microsoft finally added a built-in NAT Switch into Hyper-V!\nThis gives Hyper-V Virtual Machines access to the computer\u0026rsquo;s network. **The new switch automatically assigns IP address to your Virtual Machines, so no need to run your own DHCP server anymore! 
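If you want to verify that the new switch is present on your Fall Creators Update machine, here is a quick sketch using the Hyper-V and NetTCPIP PowerShell modules (cmdlet availability and the exact adapter alias are assumptions; run from an elevated prompt):

```powershell
# List the built-in NAT switch that ships with the Fall Creators Update
Get-VMSwitch -Name 'Default Switch'

# Inspect the IP address Windows assigned to the host-side adapter
# (the 'vEthernet (Default Switch)' alias is an assumption based on my machine)
Get-NetIPAddress -InterfaceAlias 'vEthernet (Default Switch)'
```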
In older versions of Windows 10, it was still required to create the Virtual Switch yourself, but this required static IP address assignment in the OS or the installation of a DHCP server.\nNot the most elegant option.\nThe switch is named \u0026ldquo;Default Switch\u0026rdquo; and cannot be changed in the Hyper-V Virtual Switch Manager:\nThe Default Switch Virtual Network in the Hyper-V Virtual Switch Manager According to the info message: \u0026ldquo;The Default Network switch automatically gives virtual machines access to the computer\u0026rsquo;s network using NAT (network address translation).\u0026rdquo; I\u0026rsquo;m happy that Microsoft finally introduced this, as it was already available in other 3rd Party solutions and its absence was a reason why some people didn\u0026rsquo;t want to migrate to Hyper-V.\nNow they can! I wasn\u0026rsquo;t able to find an official statement from Microsoft on this new feature, but I\u0026rsquo;m sure it will be published soon.\nWhat do you think of this new feature?\nAre you going to migrate from VMware or other solutions to Hyper-V?\nLet me know in the comments section!\nCheers, Jean-Paul\n","permalink":"https://devsecninja.com/2017/10/20/nat-switch-now-built-into-hyper-v-windows-10-fall-creators-update/","summary":"\u003cp\u003e**With the new Windows 10 Fall Creators Update, Microsoft finally added a built-in NAT Switch into Hyper-V!\u003c/p\u003e\n\u003cp\u003eThis gives Hyper-V Virtual Machines access to the computer\u0026rsquo;s network. 
**\u003cstrong\u003eThe new switch automatically assigns IP address to your Virtual Machines, so no need to run your own DHCP server anymore!\u003c/strong\u003e In older versions of Windows 10, \u003ca href=\"http://cloudenius.com/2017/09/24/create-a-hyper-v-nat-switch-with-powershell-the-easy-way/\"\u003eit was still required to create the Virtual Switch yourself\u003c/a\u003e, but this required static IP address assignment in the OS or the installation of a DHCP server.\u003c/p\u003e","title":"NAT Switch now built into Hyper-V! - Windows 10 Fall Creators Update"},{"content":"So your Group Policy (GPO) settings do not allow you to upgrade to the Windows 10 Fall Creators Update and you have local administrative access on your machine?\nThe registry fix from below will change this!\nCopy the registry fix from below and save it as fix.reg with Notepad. (Make sure you don\u0026rsquo;t save it as fix.reg.txt!) Right click on the file and click \u0026ldquo;Merge\u0026rdquo;.\nYou should now have access to Settings -\u0026gt; Update \u0026amp; Security -\u0026gt; Windows Insider Program.\nEnroll your device in the program (with your Microsoft account!) 
and select \u0026ldquo;Just fixes, apps and drivers\u0026rdquo; from the dropdown - which will enroll you in the Release Preview Ring.\nGo to Settings -\u0026gt; Update \u0026amp; Security -\u0026gt; Windows Updates and select \u0026ldquo;Check online for updates from Microsoft Update\u0026rdquo;.\nIt will take some time before the Fall Creators Update pops up here.\nWhen the Windows Insider Settings are greyed out again after several minutes, your GPO settings were re-applied and you need to rerun the fix.reg file.\nRun the fix.reg file every hour or so and check again for Windows Updates.\nAfter a couple of hours you should be able to enjoy the Fall Creators Update!\nRegistry Fix: Windows Registry Editor Version 5.00 [HKEY_LOCAL_MACHINE\\SOFTWARE\\Microsoft\\WindowsSelfHost\\Applicability] \u0026quot;EnablePreviewBuilds\u0026quot;=dword:00000002 \u0026quot;IsBuildFlightingEnabled\u0026quot;=dword:00000001 \u0026quot;IsConfigExpFlightingEnabled\u0026quot;=dword:00000001 \u0026quot;IsConfigSettingsFlightingEnabled\u0026quot;=dword:00000001 \u0026quot;SuspensionStartTime\u0026quot;=- \u0026quot;SuspensionEndTime\u0026quot;=- [HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\PreviewBuilds] \u0026quot;AllowBuildPreview\u0026quot;=dword:00000001 \u0026quot;EnableConfigFlighting\u0026quot;=dword:00000001 [HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\DataCollection] \u0026quot;CommercialId\u0026quot;=-\n","permalink":"https://devsecninja.com/2017/10/14/install-the-windows-10-fall-creators-update-on-your-gpo-enabled-machine/","summary":"\u003cp\u003eSo your Group Policy (GPO) settings do not allow you to upgrade to the Windows 10 Fall Creators Update and you have local administrative access on your machine?\u003c/p\u003e\n\u003cp\u003eThe registry fix from below will change this!\u003c/p\u003e\n\u003cp\u003eCopy the registry fix from below and save it as fix.reg with Notepad. (Make sure you don\u0026rsquo;t save it as fix.reg.txt!) 
Right click on the file and click \u0026ldquo;Merge\u0026rdquo;.\u003c/p\u003e\n\u003cp\u003eYou should now have access to Settings -\u0026gt; Update \u0026amp; Security -\u0026gt; Windows Insider Program.\u003c/p\u003e","title":"Install the Windows 10 Fall Creators Update on your GPO-enabled machine"},{"content":"You can follow the original guide by Microsoft and manually edit all the details, or just use the variables from the script below and let PowerShell do the work for you.\n# Variables $InternalSwitchName = \u0026#34;Internal Virtual Switch\u0026#34; $NATGatewayPrefixLength = \u0026#34;24\u0026#34; $NATGatewayNetwork = \u0026#34;192.168.0.0/$NATGatewayPrefixLength\u0026#34; $NATGatewayIP = \u0026#34;192.168.0.1\u0026#34; $NATNetworkName = \u0026#34;NAT Network\u0026#34; # Create the VM Switch and NAT Gateway New-VMSwitch -SwitchName $InternalSwitchName -SwitchType Internal New-NetIPAddress -IPAddress $NATGatewayIP -PrefixLength $NATGatewayPrefixLength -InterfaceIndex (Get-NetAdapter -Name $(\u0026#34;vEthernet ($InternalSwitchName)\u0026#34;)).InterfaceIndex New-NetNat -Name $NATNetworkName -InternalIPInterfaceAddressPrefix $NATGatewayNetwork ","permalink":"https://devsecninja.com/2017/09/24/create-a-hyper-v-nat-switch-with-powershell-the-easy-way/","summary":"\u003cp\u003eYou can follow the \u003ca href=\"https://docs.microsoft.com/en-us/virtualization/hyper-v-on-windows/user-guide/setup-nat-network\"\u003eoriginal guide by Microsoft\u003c/a\u003e and manually edit all the details, or just use the variables from the script below and let PowerShell do the work for you.\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" class=\"chroma\"\u003e\u003ccode class=\"language-powershell\" data-lang=\"powershell\"\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"c\"\u003e# Variables\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan 
class=\"nv\"\u003e$InternalSwitchName\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Internal Virtual Switch\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nv\"\u003e$NATGatewayPrefixLength\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;24\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nv\"\u003e$NATGatewayNetwork\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;192.168.0.0/\u003c/span\u003e\u003cspan class=\"nv\"\u003e$NATGatewayPrefixLength\u003c/span\u003e\u003cspan class=\"s2\"\u003e\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nv\"\u003e$NATGatewayIP\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;192.168.0.1\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nv\"\u003e$NATNetworkName\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;NAT Network\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"c\"\u003e# Create the VM Switch and NAT Gateway\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nb\"\u003eNew-VMSwitch\u003c/span\u003e \u003cspan class=\"n\"\u003e-SwitchName\u003c/span\u003e \u003cspan class=\"nv\"\u003e$InternalSwitchName\u003c/span\u003e \u003cspan 
class=\"n\"\u003e-SwitchType\u003c/span\u003e \u003cspan class=\"n\"\u003eInternal\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nb\"\u003eNew-NetIPAddress\u003c/span\u003e \u003cspan class=\"n\"\u003e-IPAddress\u003c/span\u003e \u003cspan class=\"nv\"\u003e$NATGatewayIP\u003c/span\u003e \u003cspan class=\"n\"\u003e-PrefixLength\u003c/span\u003e \u003cspan class=\"nv\"\u003e$NATGatewayPrefixLength\u003c/span\u003e \u003cspan class=\"n\"\u003e-InterfaceIndex\u003c/span\u003e \u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"nb\"\u003eGet-NetAdapter\u003c/span\u003e \u003cspan class=\"n\"\u003e-Name\u003c/span\u003e \u003cspan class=\"vm\"\u003e$\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"s2\"\u003e\u0026#34;vEthernet (\u003c/span\u003e\u003cspan class=\"nv\"\u003e$InternalSwitchName\u003c/span\u003e\u003cspan class=\"s2\"\u003e)\u0026#34;\u003c/span\u003e\u003cspan class=\"p\"\u003e)).\u003c/span\u003e\u003cspan class=\"py\"\u003eInterfaceIndex\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nb\"\u003eNew-NetNat\u003c/span\u003e \u003cspan class=\"n\"\u003e-Name\u003c/span\u003e \u003cspan class=\"nv\"\u003e$NATNetworkName\u003c/span\u003e \u003cspan class=\"n\"\u003e-InternalIPInterfaceAddressPrefix\u003c/span\u003e \u003cspan class=\"nv\"\u003e$NATGatewayNetwork\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e","title":"Create a Hyper-V NAT Switch with PowerShell - the easy way"},{"content":"Six months ago I received an email from our IT Department. Good news, my old 3.5 KG Dell Latitiude E6540 (with a big battery) was out of warranty. The Surface Pro wasn\u0026rsquo;t announced yet but because of the rumors, I didn\u0026rsquo;t want to go with a soon-to-be-old Surface Pro 4. 
And I must say I wanted a notebook that I can place on my Bobby Notebook Stand.\nPicture Source: Ergo2Go.nl I also didn\u0026rsquo;t want the standard models like the E7270 or E7470 with i5 and Full-HD. Because I sometimes need to run Hyper-V Labs at customers, I wanted a High Performance machine. I took the Dell Precision 5510 with the following specs:\nIntel i7-6820HQ CPU 15.6 4K Touch Screen 16 GB memory which is expandable to 32 GB NVIDIA Quadro M1000M 512 GB SSD The 4K screen is absolutely gorgeous! Windows 10 scales much better in 4K than before and works great with Server 2016 in RDP.\nIf you RDP a lot to older Operating Systems, I recommend scaling back to Full HD. I also recommend not sitting in full sunlight because of the glare.\nThe device is absolutely silent in idle.\nIsn\u0026rsquo;t that always the case when a device is in idle?!\nLet me tell you that I\u0026rsquo;ve worked with several devices from different vendors and it\u0026rsquo;s not.\nOf course you will hear the fans when you spin up a Hyper-V Lab but it\u0026rsquo;s still not bad.\nThe case itself with the thin bezels, the aluminium design and the big touchpad is fantastic.\nThe Windows 10 gestures work smoothly and quickly with the touchpad and the keyboard is solid.\nSo after six months I\u0026rsquo;m still happy with the Dell Precision 5510.\nIs there a device I\u0026rsquo;d trade it for?!\nYes, the Surface Book of course. :)\nLet me know what you think of the Precision 5510! Cheers.\n","permalink":"https://devsecninja.com/2017/09/23/dell-precision-5510-six-months-later/","summary":"\u003cp\u003eSix months ago I received an email from our IT Department. Good news, my old 3.5 KG Dell Latitude E6540 (with a big battery) was out of warranty. The Surface Pro wasn\u0026rsquo;t announced yet but because of the rumors, I didn\u0026rsquo;t want to go with a soon-to-be-old Surface Pro 4. 
And I must say I wanted a notebook that I can place on my Bobby Notebook Stand.\u003c/p\u003e\n\u003cp\u003e\u003cimg alt=\"Bobby Notebook Stand\" loading=\"lazy\" src=\"/images/2017/09/90-095-2_11-e1506154544466.jpg\"\u003e\u003c/p\u003e","title":"Dell Precision 5510: Six Months Later"},{"content":"A couple of months ago I started to work on a very intensive project with very tight deadlines.\nThis resulted in some very long days at the office and planned rest in the weekends.\nDuring that time I decided to disable the corporate email synchronization on my corporate phone. I always got triggered when I received a new email regarding project activities.\nIt\u0026rsquo;s something you recognize when it\u0026rsquo;s not there anymore.\nMy adrenaline level raised and my brain stayed active over the weekend.\nQuestions like \u0026ldquo;How can I do task X in the shortest amount of time?\u0026rdquo; and \u0026ldquo;What can we do better?\u0026rdquo; keep popping up, even during the weekends. I disabled the email synchronization on my phone and suddenly I found more rest over the weekend which resulted in being more energized in the office.\nDuring that time I used Outlook Web Access from the webbrowser on my phone.\nBut sometimes you need to quickly find an email and doing the whole authentication with MFA again can take too long.\nSo I decided to turn email synchronization back on and I disabled the notifications on both my phone and Watch instead. I even disabled that counter on top of the email app.\nDuring the weekends, I just open the personal email view and I get nearly the same results.\nIt\u0026rsquo;s still more attractive to check your corporate email over the weekend, so I would still recommend to completely disable the synchronization.\nJust try it for a month.\nLet me know what you think!\nEnjoy a more restful weekend. 
:)\n","permalink":"https://devsecninja.com/2017/09/16/restful-weekends-disable-corporate-email-on-your-phone/","summary":"\u003cp\u003eA couple of months ago I started to work on a very intensive project with very tight deadlines.\u003c/p\u003e\n\u003cp\u003eThis resulted in some very long days at the office and planned rest in the weekends.\u003c/p\u003e\n\u003cp\u003eDuring that time I decided to disable the corporate email synchronization on my corporate phone. I always got triggered when I received a new email regarding project activities.\u003c/p\u003e\n\u003cp\u003eIt\u0026rsquo;s something you recognize when it\u0026rsquo;s not there anymore.\u003c/p\u003e","title":"Restful Weekends - Disable Corporate Email On Your Phone"},{"content":"The following unknown device IDs will pop-up when you run the script or when you open Device Manager:\nROOT\\VMBUS\\0000 ROOT\\VID\\0000 ROOT\\VPCIVSP\\0000 ROOT\\STORVSP\\0000 ROOT\\SYNTH3DVSP\\0000\nIf you want to find all Unknown Devices, open PowerShell as an Administrator and run:\nGet-WmiObject Win32_PNPEntity | Where-Object { $_.ConfigManagerErrorCode -ne 0} | Select DeviceID On my work notebook, all drivers were correctly populated so it had to be something with my test laptop. It\u0026rsquo;s a fresh Windows 10 machine deployed by a Task Sequence - enabled with Device Guard and Credential Guard.\nSolution: During the installation I\u0026rsquo;ve installed the Microsoft-Hyper-V-Hypervisor feature on Windows 10. 
You also need to install the Microsoft-Hyper-V-Services if you want to have those drivers installed as well.\n","permalink":"https://devsecninja.com/2017/09/11/unknown-devices-when-installing-hyper-v-on-windows-10/","summary":"\u003cp\u003eThe following unknown device IDs will pop-up when you run the script or when you open Device Manager:\u003c/p\u003e\n\u003cblockquote\u003e\n\u003cp\u003eROOT\\VMBUS\\0000 ROOT\\VID\\0000 ROOT\\VPCIVSP\\0000 ROOT\\STORVSP\\0000 ROOT\\SYNTH3DVSP\\0000\u003c/p\u003e\n\u003c/blockquote\u003e\n\u003cp\u003eIf you want to find all Unknown Devices, open PowerShell as an Administrator and run:\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" class=\"chroma\"\u003e\u003ccode class=\"language-powershell\" data-lang=\"powershell\"\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nb\"\u003eGet-WmiObject\u003c/span\u003e \u003cspan class=\"n\"\u003eWin32_PNPEntity\u003c/span\u003e \u003cspan class=\"p\"\u003e|\u003c/span\u003e \u003cspan class=\"nb\"\u003eWhere-Object\u003c/span\u003e \u003cspan class=\"p\"\u003e{\u003c/span\u003e \u003cspan class=\"nv\"\u003e$_\u003c/span\u003e\u003cspan class=\"p\"\u003e.\u003c/span\u003e\u003cspan class=\"py\"\u003eConfigManagerErrorCode\u003c/span\u003e \u003cspan class=\"o\"\u003e-ne\u003c/span\u003e \u003cspan class=\"mf\"\u003e0\u003c/span\u003e\u003cspan class=\"p\"\u003e}\u003c/span\u003e \u003cspan class=\"p\"\u003e|\u003c/span\u003e \u003cspan class=\"nb\"\u003eSelect \u003c/span\u003e\u003cspan class=\"n\"\u003eDeviceID\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cp\u003eOn my work notebook, all drivers were correctly populated so it had to be something with my test laptop. 
It\u0026rsquo;s a fresh Windows 10 machine deployed by a Task Sequence - enabled with Device Guard and Credential Guard.\u003c/p\u003e","title":"Unknown Devices when installing Hyper-V on Windows 10"},{"content":"Recently I was trying to apply a lock screen image with a GPO. I distributed the image to the C:\\Windows\\Web\\Wallpaper directory and configured the Windows 10 GPO to that location.\nAfter running the Windows 10 Task Sequence successfully, the default lock screen image came up. I was using a large image from the client so that it still looks good on bigger screens. I found out that after resizing the image back to 1080P, the image was applied successfully after locking the machine.\nLooks like a strange bug if you ask me.\nCheers!\n","permalink":"https://devsecninja.com/2017/09/10/lock-screen-image-not-showing-windows-10-1703/","summary":"\u003cp\u003eRecently I was trying to apply a lock screen image with a GPO. I distributed the image to the C:\\Windows\\Web\\Wallpaper directory and configured the Windows 10 GPO to that location.\u003c/p\u003e\n\u003cp\u003eAfter running the Windows 10 Task Sequence successfully, the default lock screen image came up. I was using a large image from the client so that it still looks good on bigger screens. 
I\u0026rsquo;ve found out that after resizing the image back to 1080P, the image was applied successfully after locking the machine.\u003c/p\u003e","title":"Lock screen image not showing - Windows 10 1703"},{"content":"Recently Microsoft introduced Windows Autopilot.\nThis is a feature where you can register your corporate devices and where users can use their internet connection to sign in with their Azure AD credentials.\nThe device is automatically enrolled with MDM like Intune and will receive apps and policies from there.\nAccording to Microsoft\u0026rsquo;s recent blog post and instruction video, a user needs to insert their WiFi password as the device will get the configuration from MDM and is already enrolled, without having the option to change the MDM provider or enroll the device as a personal device.\nThe device really becomes a corporate-owned device. This looks a bit like the Apple Device Enrollment Program. One of the interesting parts of that instruction video, is that it looks like OneDrive can be pre-configured from OOBE as well:\nI hope that Microsoft will further expand the possibilities of this service. What I would like to see is that the device can cache/download applications and settings from Intune during the factory imaging process. This ensures that applications, policies and settings are pre-loaded on a device and don\u0026rsquo;t need to be downloaded anymore. 
This will dramatically decrease network bandwidth and deployment time.\n","permalink":"https://devsecninja.com/2017/07/04/windows-autopilot-configure-onedrive-from-oobe/","summary":"\u003cp\u003e\u003cimg alt=\"Windows AutoPilot OneDrive\" loading=\"lazy\" src=\"/images/2017/07/windowsautopilotonedrive1.png\"\u003eRecently Microsoft introduced Windows Autopilot.\u003c/p\u003e\n\u003cp\u003eThis is a feature where you can register your corporate devices and where users can use their internet connection to sign in with their Azure AD credentials.\u003c/p\u003e\n\u003cp\u003eThe device is automatically enrolled with MDM like Intune and will receive apps and policies from there.\u003c/p\u003e\n\u003cp\u003eAccording to \u003ca href=\"https://blogs.technet.microsoft.com/ausoemteam/2017/07/04/coming-soon-to-a-pc-near-you-windows-autopilot/\"\u003eMicrosoft\u0026rsquo;s recent blog post\u003c/a\u003e and instruction video, a user needs to insert their WiFi password as the device will get the configuration from MDM and is already enrolled, without having the option to change the MDM provider or enroll the device as a personal device.\u003c/p\u003e","title":"Windows Autopilot - Configure OneDrive from OOBE?!"},{"content":"Today I found out that Azure AD Domain Services is available from the new Azure Portal! The documentation is still based on using the old portal. Now you can finally use Azure Resource Manager for the VNET and deployment. Creating your first Azure AD Domain Services instance will take quite some time but is really easy to configure. Specify the DNS name of the domain, a resource group, a VNET with subnet and a subscription and you\u0026rsquo;re good to go. Enjoy this feature in the new Portal!\nAwesome! Azure AD Domain Services is available from the (new) @Azure Portal! Enjoy! 
pic.twitter.com/a5XuDofmlj\n— Jean-Paul (@JPvR_NL) July 3, 2017\n","permalink":"https://devsecninja.com/2017/07/03/azure-ad-domain-services-now-available-from-the-azure-portal/","summary":"\u003cp\u003eToday I found out that Azure AD Domain Services is available from the new Azure Portal! The documentation is still based on using the old portal. Now you can finally use Azure Resource Manager for the VNET and deployment. Creating your first Azure AD Domain Services instance will take quite some time but is really easy to configure. Specify the DNS name of the domain, a resource group, a VNET with subnet and a subscription and you\u0026rsquo;re good to go. Enjoy this feature in the new Portal!\u003c/p\u003e","title":"Azure AD Domain Services now available from the Azure Portal!"},{"content":"Last Tuesday Avanade announced the new Avanade Azure Stack Solution.\nAvanade delivers this solution from client site, at remote locations or hosted in Avanade\u0026rsquo;s own datacenters.\nAzure Stack is an extension of Azure to on-premises locations.\nPeople tend to forget that Azure Stack is not just a replacement of your physical servers running a hypervisor like Hyper-V.\nIt\u0026rsquo;s a true hybrid cloud solution.\nYou get features like Disaster Recovery with instant fail-over, Platform as a Service (PaaS) capabilities, Load Balancing, the new Portal experience and so on. I\u0026rsquo;m really excited to tell you more about this great solution.\nWhat makes Azure Stack such an interesting solution?\nWith Azure Stack, you get consistency across public and private cloud. This means you can develop applications in Azure or Azure Stack with no code changes needed. When people understand Azure, they also understand Azure Stack. No extra knowledge needed. If you want to move from Infrastructure as a Service (IaaS) to Platform as a Service (PaaS) and use this in your own datacenter. 
Imagine you can just request an on-premises SQL database from a portal without having to install, configure and manage SQL Server!\nWhen you have locations without internet access or with high latency connections.\nWhen you have data that cannot be stored in the Azure Cloud and you still want to use the power of Azure.\nWhen you are a reseller that wants to deliver a rock solid platform to your users.\nWhen your company is not ready to move to the cloud. You can use Azure Stack as a stepping stone to Azure.\nWhen you are already using Azure Pack. You can connect Azure Stack to Azure Pack with the WAP Connector.\nYou can start very small with a couple of nodes and slowly migrate to Azure Stack to spread costs. There is no need to completely replace a datacenter at once. This all sounds great. What\u0026rsquo;s the catch? I\u0026rsquo;m really wondering how fast Microsoft is able to close the gap between Azure and Azure Stack.\nNot all features in Azure are currently available in Azure Stack, so make sure that the features you want to use in Azure Stack are available during General Availability (GA).\nSince last year I\u0026rsquo;ve been involved with our internal teams to test the platform and demonstrate this new technology to our Avanade Netherlands Infrastructure team.\nAt Avanade, we have a dedicated Azure Stack environment for demos or a PoC.\nTo learn more, send an email to azurestackpoc@avanade.com or contact me.\nDisclaimer: Opinions are my own and not the views of my employer.\n","permalink":"https://devsecninja.com/2017/06/12/avanade-announces-new-microsoft-azure-stack-solution/","summary":"\u003cp\u003eLast Tuesday \u003ca href=\"https://www.avanade.com/en/media-center/press-releases/azure-stack-solution\"\u003eAvanade announced\u003c/a\u003e the new Avanade Azure Stack Solution.\u003c/p\u003e\n\u003cp\u003eAvanade delivers this solution from client site, at remote locations or hosted in Avanade\u0026rsquo;s own datacenters.\u003c/p\u003e\n\u003cp\u003eAzure 
Stack is an extension of Azure to on-premises locations.\u003c/p\u003e\n\u003cp\u003ePeople tend to forget that Azure Stack is not just a replacement of your physical servers running a hypervisor like Hyper-V.\u003c/p\u003e\n\u003cp\u003eIt\u0026rsquo;s a true hybrid cloud solution.\u003c/p\u003e\n\u003cp\u003eYou get features like Disaster Recovery with instant fail-over, Platform as a Service (PaaS) capabilities, Load Balancing, the new Portal experience and so on. I\u0026rsquo;m really excited to tell you more about this great solution.\u003c/p\u003e","title":"Avanade announces new Microsoft Azure Stack solution"},{"content":"Would you buy a new television when only the CPU is slightly faster, but the screen quality is worse?\nWould you buy a new phone, when it\u0026rsquo;s only slightly faster than the old model?\nWell, that\u0026rsquo;s the position I\u0026rsquo;m currently in.\nLast year I wanted to buy a new iPad. I was looking for an iPad around the iPad Air 2 price range.\nIt\u0026rsquo;s just for home-use like Netflix, Spotify and HomeKit, so I\u0026rsquo;m absolutely not a Pro user. I didn\u0026rsquo;t want to buy a device that was released 2 years ago.\nYou know Apple is probably working on a refresh.\nAnd you want to keep the device as long as possible, so a refresh will give you a new update-cycle.\nSo you wait for that new model to get launched. A colleague was in the same boat just like me, so we waited for next year to come.\nMy expectations were not that high for a new iPad. I just wanted a solid iPad for a nice price. 
I was in shock when Apple announced and released the iPad 2017 in March.\nHow could you release a new product, which is actually worse than the previous model?\nThe \u0026rsquo;new\u0026rsquo; iPad 2017:\nhas a non-laminated screen (not easy to read with sunlight)\nis like € 30 cheaper in the Netherlands\nhas the same cameras as the iPad Air 2 in 2014\nis more bulky and heavier\nfeels like the old iPad Air 1\nI don\u0026rsquo;t get it Apple. My expectations were not that high. The iPad Air 2 with a better CPU and a better battery would have been fine to me and an instant-buy.\nNot a downgrade to the iPad Air with a new CPU in it.\nWhen I look from the positive side, it\u0026rsquo;s just a nice tablet - when you turn your blinds down and keep the device in a dock.\nBut not when you keep in mind that the Air 2 was released 3 years ago which was a better device, but a bit slower.\nObviously, Apple wants people like me to buy the Apple iPad Pro, which will cost me € 200,- more. I\u0026rsquo;m not going to do that. I think that a lot of 4th Gen iPad and iPad Air 1/2 owners are again waiting for a new device or buying the iPad Pro.\nWhile I\u0026rsquo;m still happy with my iPhone SE for work in combination with iCloud Photos and my Apple Watch, Apple lost me in this one.\nSorry for this non-Microsoft post, but I had to write this down because I think a lot of people are full of doubts too. 
I hope I was able to help someone with this post, because I couldn\u0026rsquo;t find critical posts.\n","permalink":"https://devsecninja.com/2017/06/11/why-im-not-buying-the-new-ipad-2017/","summary":"\u003cp\u003e\u003cimg alt=\"iPad 2017\" loading=\"lazy\" src=\"/images/2017/06/hero_availability_large_2x1-e1497157481817.jpg\"\u003eWould you buy a new television when only the CPU is slightly faster, but the screen quality is worse?\u003c/p\u003e\n\u003cp\u003eWould you buy a new phone, when it\u0026rsquo;s only slightly faster than the old model?\u003c/p\u003e\n\u003cp\u003eWell, that\u0026rsquo;s the position I\u0026rsquo;m currently in.\u003c/p\u003e\n\u003cp\u003eLast year I wanted to buy a new iPad. I was looking for an iPad around the iPad Air 2 price range.\u003c/p\u003e\n\u003cp\u003eIt\u0026rsquo;s just for home-use like Netflix, Spotify and HomeKit, so I\u0026rsquo;m absolutely not a Pro user. I didn\u0026rsquo;t want to buy a device that was released 2 years ago.\u003c/p\u003e","title":"Why I'm not buying the new iPad 2017"},{"content":"As I told you before in my previous blog post, I was asked to build an interactive PowerShell script for creating Virtual Machines in Azure. In this blog post, I want to show you how I\u0026rsquo;ve created a report (or array) within PowerShell that:\nVisualize the to-be-created objects to the user Allows PowerShell to get the data of that array to create Virtual Machines. This makes sure that you have a consistent view of what PowerShell will create for you. See it as an order overview before you buy something online. Let\u0026rsquo;s imagine I have all the Virtual Machines that I want to create in $VMs. 
This can be an import of a CSV, or maybe I\u0026rsquo;ve asked the user for details with \u0026ldquo;Read-Host\u0026rdquo; or Out-GridView.\nWhen you import a CSV, the content will already be organized in an array.\nBut with this code, you can easily add more content to it and combine both data from a CSV and from the script.\nFor example, a randomized password or the first available IP address in a subnet in Azure.\nMy end goal is to have a nice overview of all the needed Virtual Machines in $Report, which I can later use to make those VMs.\nWith the code below, you will create a PSObject for every VM in the $VMs variable.\nAfter the PSObject has been created, it will be appended to the $Report variable.\nWith \u0026ldquo;$Report = @()\u0026rdquo; you ask PowerShell to create an empty array.\nSee it as an empty table that you could later use to add content to it.\nAfter the deployment has succeeded or failed, you can add the status to $VM.DeploymentStatus.\n# Set report variable\n$Report = @()\nForeach ($VM in $VMs) {\n$PSObject = New-Object PSObject -Property @{\nDeploymentName = $VM.resourceGroupName + \u0026#34;-\u0026#34; + (Get-Date -Format \u0026#34;yyyyMMdd-hh-mm-ss\u0026#34;)\nVMName = $VM.vmName\nLocation = $VM.resourceGroupLocation\nResourceGroupName = $VM.resourceGroupName\nAdminPassword = $VM.adminPassword\nVMSize = $VM.vmSize\nVirtualNetwork = $VM.virtualNetwork\nVirtualNetworkRG = $VM.virtualNetwork.ResourceGroupName\nSubnetName = $VM.subnetName\nIPAddress = $VM.ipAddress\nOperatingSystem = $VM.operatingSystem\nDeploymentStatus = $Null\n}\n$Report += $PSObject\n}\n# Show the report\n$Report\n# Or show it in Table Format\n# $Report | Format-Table\nThe above example is by far the easiest way to create a nice array for me. Thanks for reading. 
Hope you find it useful too.\n","permalink":"https://devsecninja.com/2017/06/10/powershell-how-to-create-an-array-with-psobject/","summary":"\u003cp\u003eAs I told you before in my \u003ca href=\"http://cloudenius.com/2017/06/04/powershell-using-out-gridview-to-select-a-parameter/\"\u003eprevious blog post\u003c/a\u003e, I was asked to build an interactive PowerShell script for creating Virtual Machines in Azure. In this blog post, I want to show you how I\u0026rsquo;ve created a report (or array) within PowerShell that:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eVisualize the to-be-created objects to the user\u003c/li\u003e\n\u003cli\u003eAllows PowerShell to get the data of that array to create Virtual Machines. This makes sure that you have a consistent view of what PowerShell will create for you. See it as an order overview before you buy something online.\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eLet\u0026rsquo;s imagine I have all the Virtual Machines that I want to create in $VMs. This can be an import of a CSV, or maybe I\u0026rsquo;ve asked the user for details with \u0026ldquo;Read-Host\u0026rdquo; or \u003ca href=\"http://devsecninja.com/2017/06/04/powershell-using-out-gridview-to-select-a-parameter/\"\u003eOut-GridView\u003c/a\u003e.\u003c/p\u003e","title":"PowerShell - How to Create an Array with PSObject"},{"content":"Last week I was asked to build an interactive PowerShell script for creating Virtual Machines in Azure. 
In this blog post, I want to share an easy way to prompt a user for a selection.\n# Select Azure subscription\n$AzureSubscription = (Get-AzureRmSubscription | Out-GridView -Title \u0026#34;Choose your Azure subscription and click OK.\u0026#34; -PassThru)\nWrite-Output \u0026#34;Switching to Azure subscription: $($AzureSubscription.Name)\u0026#34;\n$AzureSubscriptionInfo = Select-AzureRmSubscription -SubscriptionId $AzureSubscription.Id\nThis uses Out-GridView to display the contents of the \u0026ldquo;Get-AzureRmSubscription\u0026rdquo; Cmdlet and asks the user to make a selection. The user is able to sort and filter the contents within the grid and the user will be informed of the decision by using \u0026ldquo;Write-Output\u0026rdquo;.\nAdmittedly, it\u0026rsquo;s not the most elegant way to ask a user to select a value because it\u0026rsquo;s a pop-up and because of the small \u0026ldquo;OK\u0026rdquo; and \u0026ldquo;Cancel\u0026rdquo; buttons, but this PowerShell script was developed for IT Administrators. The benefit is that it\u0026rsquo;s easy to use with out-of-the-box code, instead of using custom modules. That\u0026rsquo;s it for now, hope you find it useful. Cheers!\n","permalink":"https://devsecninja.com/2017/06/04/powershell-using-out-gridview-to-select-a-parameter/","summary":"\u003cp\u003eLast week I was asked to build an interactive PowerShell script for creating Virtual Machines in Azure. 
In this blog post, I want to share an easy way to prompt a user for a selection.\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" class=\"chroma\"\u003e\u003ccode class=\"language-powershell\" data-lang=\"powershell\"\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"c\"\u003e# Select Azure subscription\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nv\"\u003e$AzureSubscription\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"nb\"\u003eGet-AzureRmSubscription\u003c/span\u003e \u003cspan class=\"p\"\u003e|\u003c/span\u003e \u003cspan class=\"nb\"\u003eOut-GridView\u003c/span\u003e \u003cspan class=\"n\"\u003e-Title\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Choose your Azure subscription and click OK.\u0026#34;\u003c/span\u003e \u003cspan class=\"n\"\u003e-PassThru\u003c/span\u003e\u003cspan class=\"p\"\u003e)\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nb\"\u003eWrite-Output\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Switching to Azure subscription: \u003c/span\u003e\u003cspan class=\"p\"\u003e$(\u003c/span\u003e\u003cspan class=\"nv\"\u003e$AzureSubscription\u003c/span\u003e\u003cspan class=\"p\"\u003e.\u003c/span\u003e\u003cspan class=\"n\"\u003eName\u003c/span\u003e\u003cspan class=\"p\"\u003e)\u003c/span\u003e\u003cspan class=\"s2\"\u003e\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nv\"\u003e$AzureSubscriptionInfo\u003c/span\u003e \u003cspan 
class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"nb\"\u003eSelect-AzureRmSubscription\u003c/span\u003e \u003cspan class=\"n\"\u003e-SubscriptionId\u003c/span\u003e \u003cspan class=\"nv\"\u003e$AzureSubscription\u003c/span\u003e\u003cspan class=\"p\"\u003e.\u003c/span\u003e\u003cspan class=\"py\"\u003eId\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cp\u003eThis uses Out-GridView to display the contents of the \u0026ldquo;Get-AzureRmSubscription\u0026rdquo; Cmdlet and asks the user to make a selection. The user is able to sort and filter the contents within the grid and the user will be informed of the decision by using \u0026ldquo;Write-Output\u0026rdquo;.\u003c/p\u003e","title":"PowerShell - Using Out-GridView to Select a Parameter"},{"content":"The underfloor heating and cooling system of my apartment is managed by a WTH thermostat. Last week had temperatures of 30+ °C, which is why I wanted to switch from heating to cooling the floor of my apartment. Somehow the instructions in the manual didn\u0026rsquo;t work for me and I ended up with error code A2.2. Of course - an error code that\u0026rsquo;s not described in the manual. Below is an image of the WTH D9380 rF-t thermostat I have.\nI was advised to call the company that implemented the underfloor heating and cooling system, but hey.. \u0026ldquo;I\u0026rsquo;m a tech guy!\u0026rdquo;. I found out that there is another icon for winter and summer configuration:\nBoth icons were not visible on my display, just like the example above. I had to dig into the configuration to change the P01 parameter from 0 to 1. You can do that with the following steps (Dutch):\nI know this post was not about IT, but hopefully I can help some people struggling with the same issue. 
Cheers!\n","permalink":"https://devsecninja.com/2017/06/01/wth-thermostat-error-code-a2.2/","summary":"\u003cp\u003eThe underfloor heating and cooling system of my apartment is managed by a WTH thermostat. Last week had temperatures of 30+ °C, which is why I wanted to switch from heating to cooling the floor of my apartment. Somehow the instructions in the manual didn\u0026rsquo;t work for me and I ended up with error code A2.2. Of course - an error code that\u0026rsquo;s not described in the manual. Below is an image of the WTH D9380 rF-t thermostat I have.\u003c/p\u003e","title":"WTH Thermostat - Error Code A2.2"},{"content":"Cause Recently I replaced my workstation and that was a perfect time to rebuild my home lab.\nAfter I got the green light from my employer to install the all new Windows 10 Creators Update, I also installed Hyper-V and started to build servers in my lab. I was playing around with Shielding, Virtual TPM and SecureBoot until I found out that RemoteFX didn\u0026rsquo;t work anymore. I added the RemoteFX adapter to a VM with shielding enabled, but saw in the Hyper-V Settings menu that \u0026ldquo;0 virtual machines are currently using this GPU\u0026rdquo;. I first thought about updating my drivers, but I realized that I was playing around with some new features.\nAfter disabling Shielding for this VM, RemoteFX started to work!\nSolution When RemoteFX doesn\u0026rsquo;t work, check if VM Shielding has been enabled. 
Disable VM Shielding and test your connection again with RDP.\n0 virtual machines are currently using this GPU\n","permalink":"https://devsecninja.com/2017/05/21/hyper-v-remotefx-doesnt-work-with-shielded-vms/","summary":"\u003ch2 id=\"cause\"\u003eCause\u003c/h2\u003e\n\u003cp\u003eRecently I replaced my workstation and that was a perfect time to rebuild my home lab.\u003c/p\u003e\n\u003cp\u003eAfter I got green lights from my employer to install the all new Windows 10 Creators Update, I also installed Hyper-V and started to build servers in my lab. I was playing around with Shielding, Virtual TPM and SecureBoot until I found out that RemoteFX didn\u0026rsquo;t work anymore. I added the RemoteFX adapter to a VM with shielding enabled, but saw in the Hyper-V Settings menu that \u0026ldquo;0 virtual machines are currently using this GPU\u0026rdquo;. I first thought about updating my drivers, but I realized that I was playing around with some new features.\u003c/p\u003e","title":"Hyper-V RemoteFX doesn't work with Shielded VMs"},{"content":"Recently I bought a new soundbar for my Samsung television. I connected it with TOSLINK/SPDIF (digital audio) to my Samsung television. 
I didn\u0026rsquo;t like using the second remote control of the soundbar, so I used the \u0026ldquo;Learn\u0026rdquo; functionality from Yamaha to \u0026rsquo;learn\u0026rsquo; the volume and power functions of my Samsung remote control.\nAfter a quick setup I found out that - of course - when you use the television remote to control the volume, both the television and soundbar will adjust to this.\nMy Samsung television lacks the possibility to turn the audio off while still using the volume buttons.\nAfter brainstorming I found a nice workaround for this.\nJust connect a headphone jack into the headphone output of the television.\nYou can use the headphone jack of your old in-ears if you want.\nNow your television should think that you\u0026rsquo;re listening through the headphones instead of the television audio which will:\nEnable you to control the volume of the television (and the soundbar - because of the \u0026lsquo;Learn\u0026rsquo; functionality)\nAudio will go through the soundbar and through the headphone jack.\nThis was a great workaround for me. Hope this helps you too. Cheers.\n","permalink":"https://devsecninja.com/2017/02/04/how-to-use-a-soundbar-without-audio-from-television/","summary":"\u003cp\u003eRecently I bought a new soundbar for my Samsung television. I connected it with TOSLINK/SPDIF (digital audio) to my Samsung television. I didn\u0026rsquo;t like using the second remote control of the soundbar, so I used the \u0026ldquo;Learn\u0026rdquo; functionality from Yamaha to \u0026rsquo;learn\u0026rsquo; the volume and power functions of my Samsung remote control.\u003c/p\u003e\n\u003cp\u003eAfter a quick setup I found out that - of course - when you use the television remote to control the volume, both the television and soundbar will adjust to this.\u003c/p\u003e","title":"How to use a Soundbar without Audio from Television"},{"content":"2016 has been an amazing year for me. 
One of the biggest highlights was Microsoft Ignite in Atlanta.\nWith a group of 5 Avanade people we went to the conference to find out what Microsoft had been working on for the last couple of months. I\u0026rsquo;m still convinced that I gain most knowledge at conferences like Ignite, because people from Microsoft present the latest innovations.\nAlso a lot of features are planned to go Generally Available (GA) during events like this. I want to thank Avanade for this fantastic opportunity.\nAfter the conference, I went to Washington DC and New York to visit these cities which was very exciting.\nThis was also a year where I worked a lot with the Microsoft Azure platform. I passed the 70-533 and 70-534 exams around Q1 and worked on several projects with Azure.\nThis will be an important skill for 2017 because of the strong adoption of Azure.\nAs of 2016 I\u0026rsquo;ve been promoted to Consultant level within Avanade, which is a big opportunity for me to grow within the firm.\nRight now I\u0026rsquo;m really close to finishing the first year of a bachelor\u0026rsquo;s degree, which I worked on for the last couple of years.\nPursuing a bachelor is really important to me on the long-term, because it\u0026rsquo;s something that currently lacks on my resume.\nIt can be a limiting factor when I want to grow to a management position.\nThank you very much for reading my blog posts in 2016.\nHappy New Year!\n","permalink":"https://devsecninja.com/2016/12/31/2016-what-a-year/","summary":"\u003cp\u003e2016 has been an amazing year for me. One of the biggest highlights was Microsoft Ignite in Atlanta.\u003c/p\u003e\n\u003cp\u003eWith a group of 5 Avanade people we went to the conference to find out what Microsoft had been working on for the last couple of months. 
I\u0026rsquo;m still convinced that I gain most knowledge at conferences like Ignite, because people from Microsoft present the latest innovations.\u003c/p\u003e\n\u003cp\u003eAlso a lot of features are planned to go Generally Available (GA) during events like this. I want to thank Avanade for this fantastic opportunity.\u003c/p\u003e","title":"2016 - What a Year!"},{"content":"Some error outputs are not always useful. Especially when they make no sense for the issue you have. Error message: New-AzureRmResourceGroupDeployment : A parameter cannot be found that matches parameter name \u0026lsquo;YOURPARAMETER\u0026rsquo; Solution: This error occurs because of at least the 2 following issues:\nYou didn\u0026rsquo;t specify a parameter for \u0026lsquo;YOURPARAMETER\u0026rsquo; in your JSON template. That\u0026rsquo;s what the error says. If you forget to specify a parameter with the New-AzureRmResourceGroupDeployment cmdlet, you\u0026rsquo;ll see a prompt to insert a value for that parameter. But if you add a parameter like -Name \u0026ldquo;VM01\u0026rdquo; to the command while it\u0026rsquo;s not specified in the JSON template, you\u0026rsquo;ll see this error. The JSON code you provided isn\u0026rsquo;t valid. Always validate your JSON code. You can use http://www.jsoneditoronline.org/, paste your code and look for the red \u0026ldquo;X\u0026rdquo; buttons after a line number. Did you find another issue where this error occurs? Please let me know in the comments section. Cheers!\n","permalink":"https://devsecninja.com/2016/12/14/azure-a-parameter-cannot-be-found-that-matches-parameter-name/","summary":"\u003cp\u003eSome error outputs are not always useful. Especially when they make no sense for the issue you have. 
\u003cstrong\u003eError message:\u003c/strong\u003e New-AzureRmResourceGroupDeployment : A parameter cannot be found that matches parameter name \u0026lsquo;YOURPARAMETER\u0026rsquo; \u003cstrong\u003eSolution:\u003c/strong\u003e This error occurs because of at least the 2 following issues:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eYou didn\u0026rsquo;t specify a parameter for \u0026lsquo;YOURPARAMETER\u0026rsquo; in your JSON template. That\u0026rsquo;s what the error says. If you forget to specify a parameter with the New-AzureRmResourceGroupDeployment cmdlet, you\u0026rsquo;ll see a prompt to insert a value for that parameter. But if you add a parameter like -Name \u0026ldquo;VM01\u0026rdquo; to the command while it\u0026rsquo;s not specified in the JSON template, you\u0026rsquo;ll see this error.\u003c/li\u003e\n\u003cli\u003eThe JSON code you provided isn\u0026rsquo;t valid. Always validate your JSON code. You can use http://www.jsoneditoronline.org/, paste your code and look for the red \u0026ldquo;X\u0026rdquo; buttons after a line number.\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eDid you find another issue where this error occurs? Please let me know in the comments section. Cheers!\u003c/p\u003e","title":"Azure - A parameter cannot be found that matches parameter name"},{"content":"Today I met Jeffrey Snover and had a lot of conversations with new IT Pro\u0026rsquo;s. That\u0026rsquo;s what I like about Ignite: connecting with Microsoft Experts and other IT Pro\u0026rsquo;s! Tonight is the Attendee Celebration at the Olympic Park. Tomorrow is the last day at Ignite and the conference will end at 14:00. Below are the sessions I followed today and can recommend:\nUnderstand Credential Security by Paula Januszkiewicz If you can attend a session of Paula, always do it because she has interesting sessions about security. She will demo how cached credentials work and will show you how to get the user\u0026rsquo;s credentials with Classic Data Protection API. 
Paula will demo how to decrypt KeePass if you use Windows User Authentication with the Data Protection API. Paula will show you how to extract credentials from a Windows service. You need access to the registry for this hack. She will show you how to get access to the password in a SID-protected PFX certificate file and how to access Windows with smart card authentication turned on, without a smart card. ProTip from the session: know and limit your domain admins! Domain admins can also do tricks as other users from the domain. Conduct a successful pilot deployment of Microsoft Intune You\u0026rsquo;ll learn how to start a successful pilot and get tips from the field. Follow me on Twitter for live news from Ignite!\n","permalink":"https://devsecninja.com/2016/09/29/microsoft-ignite-day-04-29-09-2016/","summary":"\u003cp\u003eToday I met Jeffrey Snover and had a lot of conversations with new IT Pro\u0026rsquo;s. That\u0026rsquo;s what I like about Ignite: connecting with Microsoft Experts and other IT Pro\u0026rsquo;s! Tonight is the Attendee Celebration at the Olympic Park. Tomorrow is the last day at Ignite and the conference will end at 14:00. Below are the sessions I followed today and can recommend:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eUnderstand Credential Security by Paula Januszkiewicz\u003c/strong\u003e\n\u003cul\u003e\n\u003cli\u003eIf you can attend a session of Paula, always do it because she has interesting sessions about security.\u003c/li\u003e\n\u003cli\u003eShe will demo how cached credentials work and will show you how to get the user\u0026rsquo;s credentials with Classic Data Protection API.\u003c/li\u003e\n\u003cli\u003ePaula will demo how to decrypt KeePass if you use Windows User Authentication with the Data Protection API.\u003c/li\u003e\n\u003cli\u003ePaula will show you how to extract credentials from a Windows service. 
You need access to the registry for this hack.\u003c/li\u003e\n\u003cli\u003eShe will show you how to get access to the password in a SID-protected PFX certificate file and how to access Windows with smart card authentication turned on, without a smart card.\u003c/li\u003e\n\u003cli\u003eProTip from the session: know and limit your domain admins! Domain admins can also perform these tricks against other users in the domain.\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eConduct a successful pilot deployment of Microsoft Intune\u003c/strong\u003e\n\u003cul\u003e\n\u003cli\u003eYou\u0026rsquo;ll learn how to start a successful pilot and get tips from the field.\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003ca href=\"https://twitter.com/JPvR_NL\"\u003eFollow me on Twitter\u003c/a\u003e for live news from Ignite!\u003c/p\u003e","title":"Microsoft Ignite – Day 04 – 29-09-2016"},{"content":"Yes, a new day at the Microsoft Ignite conference! All the sessions are spread across 3 buildings (A, B, C) and I must say that this keeps you fit during the conference. On Monday, my iPhone showed me the following stats for the day:
18.8 kilometers is around 11.6 miles!
Today I really enjoyed the session with Jeffrey Snover and Don Jones about PowerShell.
I don\u0026rsquo;t know if it was recorded and will be available later, but I can highly recommend it.
In the afternoon I met Jason Helmick, a really great guy who (together with Jeffrey Snover) taught me what PowerShell is and how it can be used.
One of the greatest courses you can find on the internet today is an MVA course with Jeffrey and Jason.
Before attending Ignite I was working on a Managed Azure Hosting Solution for one of our customers.
We are using Azure ARM Policies to block some types of services from being deployed.
Azure currently checks during the deployment if it\u0026rsquo;s valid according to the ARM Policies.
So when I block Ubuntu images, the user will get an error message after he/she deployed the resources.
That is really annoying, because the user has to do the deployment all over again with another OS, and he/she needs to provide all the details of the deployment again. I found out that Microsoft is working on a solution, but they couldn\u0026rsquo;t communicate a timeline.
It\u0026rsquo;s great to meet the team and companies behind the products you use in your day-to-day life!
Below are the sessions I followed today and can recommend:
Explore PowerShell unplugged with Jeffrey Snover and Don Jones Very funny and cool session about new functionalities in PowerShell V5. You\u0026rsquo;ll also learn how to grab and structure data from the GitHub API. Get notes from the field: implementing Nano Server in production around the world. I expected a lot of interesting information about how Nano Server works in production and the pitfalls, but didn\u0026rsquo;t find a lot of new information. But still recommended if you want to know more about Nano Server. Dive into Microsoft Azure Stack Architecture This session was all about the architecture of Microsoft Azure Stack. Found some great new information about the underlying components that are needed, how the infrastructure stays highly available and more about VM sizes.
Follow me on Twitter for live news from Ignite!\n","permalink":"https://devsecninja.com/2016/09/28/microsoft-ignite-day-03-28-09-2016/","summary":"\u003cp\u003eYes, a new day at the Microsoft Ignite conference! All the sessions are spread across 3 buildings (A, B, C) and I must say that this keeps you fit during the conference. On Monday, my iPhone showed me the following stats for the day:\u003c/p\u003e\n\u003cp\u003e\u003cimg alt=\"ignite-health-stats\" loading=\"lazy\" src=\"/images/2016/09/ignite-health-stats.png\"\u003e 18.8 kilometers is around 11.6 miles!\u003c/p\u003e\n\u003cp\u003eToday I really enjoyed the session with Jeffrey Snover and Don Jones about PowerShell. I don\u0026rsquo;t know if it was recorded and will be available later, but I can highly recommend it.\u003c/p\u003e","title":"Microsoft Ignite – Day 03 – 28-09-2016"},{"content":"Day 2 of the Microsoft Ignite conference at Atlanta started early for us at 9:00. We had breakfast at 8:00 and were travelling by metro, which took us about 30 minutes. Below are the sessions I followed today and recommend:
Explore Microsoft Azure Stack \u0026ldquo;State of the Union\u0026rdquo; - Foundation 1 This was a high-level overview presentation of Azure Stack. Didn\u0026rsquo;t learn a lot about the technology, but some announcements were quite interesting: Azure Stack will be Generally Available in Mid-CY17. From the template deployment blade in Azure Stack, you can now easily select a QuickStart template from the GitHub QuickStart Template repo. Azure RM Template Validator and Azure RM Policy for Azure Stack can be downloaded from the Azure GitHub repo. You can manage Azure Pack from Azure Stack with an extension. You can download images directly from Azure to Azure Stack. For example, you want the SQL Server image in Azure Stack. Now you can go to the \u0026ldquo;Marketplace\u0026rdquo; blade in Azure Stack and download the bits.
Key Vault, Queue Storage and the VPN Gateway features are added to Azure Stack. Discover what\u0026rsquo;s new in device management New lockdown capabilities for kiosk PC\u0026rsquo;s. Create read-only devices. Only allow specific approved USB devices. Block Edge swipe gestures. Learn about Windows 10 Secure Kernel The presenter (Sami Laiho) takes you into a deep-dive about the secure kernel of Windows 10. Very interesting but difficult session. Master Windows 10 Deployments - Expert Level Interesting session with lots of deployment tips and tricks. Windows Containers Containers are slowly introduced to the public with Windows Server 2016. This is definitely something that I\u0026rsquo;ll work on in my home lab shortly. Docker Images can be found at hub.docker.com. E.g. use the Microsoft/IIS docker image to create a container with IIS. The presentation slides are shared on docs.com. Between all the sessions I took some time to work with my home lab in Azure and get some hands-on experience with all the new features. You can sit outside in the sun in comfortable seats if you want. I also took some time to visit all the stands in the expo.
Tip: getting \u0026lsquo;free\u0026rsquo; goodies is nice, but understand that by scanning your badge (which is needed), this company will know your first name, last name and your unique number. This can probably be used to send you information (also known as spam :)) by checking your ID with the Ignite database to get your email address. So don\u0026rsquo;t let them scan your badge too often! Looking forward to Day 3 tomorrow! Cheers!\n","permalink":"https://devsecninja.com/2016/09/27/microsoft-ignite-day-02-27-09-2016/","summary":"\u003cp\u003eDay 2 of the Microsoft Ignite conference at Atlanta started early for us at 9:00. We had breakfast at 8:00 and were travelling by metro, which took us about 30 minutes.
Below are the sessions I followed today and recommend:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eExplore Microsoft Azure Stack \u0026ldquo;State of the Union\u0026rdquo; - Foundation 1\u003c/strong\u003e\n\u003cul\u003e\n\u003cli\u003eThis was a high-level overview presentation of Azure Stack. Didn\u0026rsquo;t learn a lot about the technology, but some announcements were quite interesting:\n\u003cul\u003e\n\u003cli\u003eAzure Stack will be Generally Available in Mid-CY17.\u003c/li\u003e\n\u003cli\u003eFrom the template deployment blade in Azure Stack, you can now easily select a QuickStart template from the GitHub QuickStart Template repo.\u003c/li\u003e\n\u003cli\u003eAzure RM Template Validator and Azure RM Policy for Azure Stack can be downloaded from the \u003ca href=\"https://github.com/Azure/AzureStack-Tools\"\u003eAzure GitHub repo\u003c/a\u003e.\u003c/li\u003e\n\u003cli\u003eYou can manage Azure Pack from Azure Stack with an extension.\u003c/li\u003e\n\u003cli\u003eYou can download images directly from Azure to Azure Stack. For example, you want the SQL Server image in Azure Stack.
Now you can go to the \u0026ldquo;Marketplace\u0026rdquo; blade in Azure Stack and download the bits.\u003c/li\u003e\n\u003cli\u003eKey Vault, Queue Storage and the VPN Gateway features are added to Azure Stack.\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eDiscover what\u0026rsquo;s new in device management\u003c/strong\u003e\n\u003cul\u003e\n\u003cli\u003eNew lockdown capabilities for kiosk PC\u0026rsquo;s.\n\u003cul\u003e\n\u003cli\u003eCreate read-only devices.\u003c/li\u003e\n\u003cli\u003eOnly allow specific approved USB devices.\u003c/li\u003e\n\u003cli\u003eBlock Edge swipe gestures.\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eLearn about Windows 10 Secure Kernel\u003c/strong\u003e\n\u003cul\u003e\n\u003cli\u003eThe presenter (Sami Laiho) takes you into a deep-dive about the secure kernel of Windows 10. Very interesting but difficult session.\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eMaster Windows 10 Deployments - Expert Level\u003c/strong\u003e\n\u003cul\u003e\n\u003cli\u003eInteresting session with lots of deployment tips and tricks.\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eWindows Containers\u003c/strong\u003e\n\u003cul\u003e\n\u003cli\u003eContainers are slowly introduced to the public with Windows Server 2016. This is definitely something that I\u0026rsquo;ll work on in my home lab shortly.\u003c/li\u003e\n\u003cli\u003eDocker Images can be found at hub.docker.com. E.g. 
use \u003ca href=\"https://hub.docker.com/r/microsoft/iis/\"\u003ethe Microsoft/IIS docker image\u003c/a\u003e to create a container with IIS.\u003c/li\u003e\n\u003cli\u003eThe presentation slides are shared \u003ca href=\"https://docs.com/taylorbrown/1326/windows-containers-ignite\"\u003eon docs.com\u003c/a\u003e.\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eBetween all the sessions I took some time to work with my home lab in Azure and get some hands-on experience with all the new features. You can sit outside in the sun in comfortable seats if you want. I also took some time to visit all the stands in the expo.\u003c/p\u003e","title":"Microsoft Ignite – Day 02 – 27-09-2016"},{"content":"I\u0026rsquo;m sitting in my hotel room in Buckhead, Atlanta thinking about all the Ignite sessions of today.
It\u0026rsquo;s around 23:00 and we will visit Ignite early tomorrow, so I\u0026rsquo;ll keep it short.
The day started at the C-building with a nice breakfast.
The keynote was presented in Philips Arena, which is the home arena of the Atlanta Hawks basketball team.
We went to the keynote quite early in the morning but still it was hard to find a good seat.
We were sitting at one of the top rings of the large arena.
My 3D Selfie that was made on Sunday was being displayed on the big screen before the keynote started!
That was really cool and a big surprise!
My 3D Selfie was even displayed in the main entrance and during the keynote of Satya Nadella! (Pictures will follow later) During the keynote, the following news was announced:
Windows Server 2016: General Availability in October 2016 Microsoft added the commercial version of Docker to Windows Server 2016 Second Technical Preview of Azure Stack Windows Defender Application Guard. This new feature uses virtualization technology to open any links clicked on in a sandbox to prevent malicious code from spreading across the network and other devices.
Not all sessions that I joined today can be watched from the Ignite website, but I can highly recommend the following sessions:
Opening Keynote by Scott Guthrie Security with Brad Anderson Cloud Infrastructure with Jason Zander Hope to have some more time tomorrow to write a new blog post.\n","permalink":"https://devsecninja.com/2016/09/27/microsoft-ignite-day-01-26-09-2016/","summary":"\u003cp\u003eI\u0026rsquo;m sitting in my hotel room in Buckhead, Atlanta thinking about all the Ignite sessions of today.\u003c/p\u003e\n\u003cp\u003eIt\u0026rsquo;s around 23:00 and we will visit Ignite early tomorrow, so I\u0026rsquo;ll keep it short.\u003c/p\u003e\n\u003cp\u003eThe day started at the C-building with a nice breakfast.\u003c/p\u003e\n\u003cp\u003eThe keynote was presented in Philips Arena, which is the home arena of the Atlanta Hawks basketball team.\u003c/p\u003e\n\u003cp\u003eWe went to \u003ca href=\"https://mediastream.microsoft.com/events/2016/1609/Ignite/player/keynote-am.html\"\u003ethe keynote\u003c/a\u003e quite early in the morning but still it was hard to find a good seat.\u003c/p\u003e","title":"Microsoft Ignite - Day 01 - 26-09-2016"},{"content":"This week I received my new monitor, the Dell UltraSharp U3415W Curved, as a replacement for my Dell UltraSharp U2312HM Triple Screen setup.
My workplace is going to be part of my living room this year, so I don\u0026rsquo;t want a triple screen setup there.
After I placed the screen on my desk, I had a \u0026ldquo;WOW\u0026rdquo; moment.
Absolutely beautiful screen and design.
Don\u0026rsquo;t expect a full review here, but just some things you should know before you buy this monitor.
Things you should know No DVI connector. Yes I know, DVI is legacy, but my Dell Docking Station has a DisplayPort, VGA and DVI connector.
On my triple screen setup, I connected the DisplayPort with a DP =\u0026gt; HDMI converter to my receiver and the DVI/VGA connectors to two of my monitors.
With this new Dell screen, I connect the DisplayPort connector to the Mini-DisplayPort on the Dell screen with the MDP =\u0026gt; DP cable from Dell. I have to use the HDMI cable of my receiver to my notebook separately, instead of using my dock.
Dell Monitor Stands are great, but this screen is quite big. Because of that, I need a stand that\u0026rsquo;s higher because otherwise I get sick after using the monitor for a couple of hours. With DisplayPort, the BitLocker PIN prompt before starting my notebook doesn\u0026rsquo;t show up. So I have to wait for a couple of seconds after booting and guess that my notebook is waiting at the BitLocker PIN prompt so I can type the BitLocker PIN. This is probably an issue with Windows 10 or my Dell notebook, but it\u0026rsquo;s again something you should know. Backlight bleeding in the corners Auto switching between inputs has been removed from (as far as I know) every new Dell monitor. My U2312HM switches between inputs automatically when I switch from notebook to desktop. No AMD FreeSync or GSync. I noticed this during an iRacing session. It\u0026rsquo;s a bit overdone in this picture below, but I noticed it a couple of times during a lap around the track. This could be very distracting during a two-hour race. No, this is not a gaming monitor, but you should know this. Don\u0026rsquo;t get me wrong, this screen is amazing. It\u0026rsquo;s the best screen I\u0026rsquo;ve ever seen with amazing colors because of Dell\u0026rsquo;s factory calibration.
But you should know that the screen isn\u0026rsquo;t perfect and I had some issues with it.\n","permalink":"https://devsecninja.com/2016/07/30/dell-ultrasharp-u3415w-34-curved/","summary":"\u003cp\u003e\u003cimg alt=\"Dell U3415W Monitor\" loading=\"lazy\" src=\"/images/2016/07/monitor-u3415w-hero1.jpg\"\u003eThis week I received my new monitor, the Dell UltraSharp U3415W Curved, as a replacement for my Dell UltraSharp U2312HM Triple Screen setup.\u003c/p\u003e\n\u003cp\u003eMy workplace is going to be part of my living room this year, so I don\u0026rsquo;t want a triple screen setup there.\u003c/p\u003e\n\u003cp\u003eAfter I placed the screen on my desk, I had a \u0026ldquo;WOW\u0026rdquo; moment.\u003c/p\u003e\n\u003cp\u003eAbsolutely beautiful screen and design.\u003c/p\u003e\n\u003cp\u003eDon\u0026rsquo;t expect a full review here, but just some things you should know before you buy this monitor.\u003c/p\u003e","title":"Dell UltraSharp U3415W 34\" Curved"},{"content":"Today I was working on an Azure project where the deployment of Azure resources needed to be automated.
Problem You\u0026rsquo;ll see the following error message: ERROR: New-AzureRmResourceGroupDeployment : 7:27:39 AM - Resource Microsoft.Compute/virtualMachines/extensions 'test01/dscExtension' failed with message '{ \u0026quot;status\u0026quot;: \u0026quot;Failed\u0026quot;, \u0026quot;error\u0026quot;: { \u0026quot;code\u0026quot;: \u0026quot;ResourceDeploymentFailure\u0026quot;, \u0026quot;message\u0026quot;: \u0026quot;The resource operation completed with terminal provisioning state 'Failed'.\u0026quot;, \u0026quot;details\u0026quot;: [ { \u0026quot;code\u0026quot;: \u0026quot;VMExtensionProvisioningError\u0026quot;, \u0026quot;message\u0026quot;: \u0026quot;VM has reported a failure when processing extension 'dscExtension'.
Error message: \u0026quot;The DSC Extension failed
to execute: Cannot process argument transformation on parameter \u0026lsquo;ConfigurationFunction\u0026rsquo;.
Cannot convert value to type System.String.\\r\\nMore information about the failure can be found in the logs located under \u0026lsquo;C:\\WindowsAzure\\Logs\\Plugins\\Microsoft.Powershell.DSC\\2.19.0.0\u0026rsquo; on the VM.\u0026quot;.\u0026quot;}]}}'
Solution This issue probably occurs when the ZIP file with the DSC config cannot be found. When you use a program like 7-Zip to zip your PowerShell script, it will create a ZIP file named Yourfile.zip instead of Yourfile.ps1.zip. All of the scripts I\u0026rsquo;ve found in the Quickstart Repository need a .ps1.zip extension. So don\u0026rsquo;t forget to check if the file with your script has a .ps1.zip extension.\n","permalink":"https://devsecninja.com/2016/07/24/azure-automation-cannot-process-argument-transformation-on-parameter-configurationfunction/","summary":"\u003cp\u003eToday I was working on an Azure project where the deployment of Azure resources needed to be automated.\u003c/p\u003e\n\u003ch2 id=\"problem\"\u003eProblem\u003c/h2\u003e\n\u003cp\u003eYou\u0026rsquo;ll see the following error message: \u003ccode\u003eERROR: New-AzureRmResourceGroupDeployment : 7:27:39 AM - Resource Microsoft.Compute/virtualMachines/extensions\u003c/code\u003e \u003ccode\u003e'test01/dscExtension' failed with message '{\u003c/code\u003e \u003ccode\u003e\u0026quot;status\u0026quot;: \u0026quot;Failed\u0026quot;,\u003c/code\u003e \u003ccode\u003e\u0026quot;error\u0026quot;: {\u003c/code\u003e \u003ccode\u003e\u0026quot;code\u0026quot;: \u0026quot;ResourceDeploymentFailure\u0026quot;,\u003c/code\u003e \u003ccode\u003e\u0026quot;message\u0026quot;: \u0026quot;The resource operation completed with terminal provisioning state 'Failed'.\u0026quot;,\u003c/code\u003e \u003ccode\u003e\u0026quot;details\u0026quot;: [\u003c/code\u003e \u003ccode\u003e{\u003c/code\u003e \u003ccode\u003e\u0026quot;code\u0026quot;:
\u0026quot;VMExtensionProvisioningError\u0026quot;,\u003c/code\u003e \u0026quot;message\u0026quot;: \u0026quot;VM has reported a failure when processing extension 'dscExtension'.\u003c/p\u003e\n\u003cp\u003eError message: \u0026quot;The DSC Extension failed to execute: \u003cstrong\u003eCannot process argument transformation on parameter \u0026lsquo;ConfigurationFunction\u0026rsquo;\u003c/strong\u003e.\u003c/p\u003e","title":"Azure Automation - Cannot process argument transformation on parameter 'ConfigurationFunction'"},{"content":"Microsoft is pushing everything to an \u0026ldquo;As a Service\u0026rdquo; model. I think that\u0026rsquo;s great because it helps you - for example - stay in control of licenses and costs.
Microsoft recently announced the \u0026ldquo;Windows 10 as a Service\u0026rdquo; and \u0026ldquo;Surface as a Service\u0026rdquo; services.
The Surface Pro 4 is a fantastic device, but in my opinion lacks a couple of essential features to classify it as an Enterprise Device. I have worked with large organizations, and the first 2 check boxes on the acceptance list are:
The device needs a Kensington lock The user needs to provide the BitLocker PIN to start the device I think those points are really necessary for an enterprise device. Without a Kensington lock, a device can be easily stolen during a short break. (Or do you take your Surface with you when going to the toilet?!) Most organizations require a BitLocker PIN to unlock the device.
It\u0026rsquo;s possible to use an on-screen keyboard during the preboot screen, but I don\u0026rsquo;t see any business using this.
Mark Morowczynski from Microsoft says that this is because an attacker can connect to the machine using DMA or retrieve the secrets from memory.\nThe Surface Pro 4 DMA connector is soldered on the motherboard, but the memory can still be easily stolen without a Kensington lock!\nSo what do you think?\nShould Microsoft add the Kensington lock to the Surface Pro 5?\n","permalink":"https://devsecninja.com/2016/07/20/microsoft-surface-pro-an-enterprise-device/","summary":"\u003cp\u003eMicrosoft is pushing everything to an \u0026ldquo;As a Service\u0026rdquo; model. I think that\u0026rsquo;s great because of - for example - staying in control of licenses and costs.\u003c/p\u003e\n\u003cp\u003eMicrosoft recently announced the \u003ca href=\"https://blogs.windows.com/windowsexperience/2016/07/12/announcing-new-subscription-options-for-windows-10-and-surface-for-businesses/\"\u003e\u0026ldquo;Windows 10 as a Service\u0026rdquo; and \u0026ldquo;Surface as a Service\u0026rdquo;\u003c/a\u003e services.\u003c/p\u003e\n\u003cp\u003eThe Surface Pro 4 is a fantastic device, but in my opinion lacks a couple of essential features to classify it as an Enterprise Device. I worked with large organizations and the first 2 check boxes on the acceptance list are:\u003c/p\u003e","title":"Microsoft Surface Pro - An enterprise device?"},{"content":"Speed up your Nexus 7! My Nexus 7 was running very slowly on Android 5.1.1 Lollipop. I\u0026rsquo;ve tried to hard reset the tablet several times, but after a couple of weeks it became very slow again with even 2 apps installed. I followed the 2 steps from the following video of \u0026ldquo;Abs Recon -Solutions\u0026rdquo;. The Dynamic Gesturing option wasn\u0026rsquo;t available on my tablet, but disabling Gesture Typing worked perfectly fine!\n","permalink":"https://devsecninja.com/2016/07/01/nexus-7-2013-is-running-slowly-the-fix/","summary":"\u003cp\u003eSpeed up your Nexus 7! My Nexus 7 was running very slowly on Android 5.1.1 Lollipop. 
I\u0026rsquo;ve tried to hard reset the tablet several times, but after a couple of weeks it became very slow again, even with only 2 apps installed. I followed the 2 steps from the following video by \u0026ldquo;Abs Recon -Solutions\u0026rdquo;. The Dynamic Gesturing option wasn\u0026rsquo;t available on my tablet, but disabling Gesture Typing worked perfectly fine!\u003c/p\u003e","title":"Nexus 7 (2013) is running slowly - the fix!"},{"content":"I\u0026rsquo;m so happy to tell you that I\u0026rsquo;m one of the 5 people from Avanade Netherlands that will attend Microsoft Ignite in Atlanta this year!
We are all really excited as this is the first time that we attend a big conference like Ignite.
Last year I followed a lot of online presentations from home, and the high quality of the presentations makes this the best Microsoft event in the world.
Seeing Jeffrey Snover, Mark Russinovich, Satya Nadella and many more people for the first time is amazing! I\u0026rsquo;m looking forward to meeting all my Twitter \u0026rsquo;tweeps\u0026rsquo; as well!
All the hotels within a range of 10 kilometers of Ignite were already sold out, so we need to use the subway to visit the Congress Center.
After the event, I\u0026rsquo;m going on a trip with a colleague to Washington DC and New York, because we are a couple of hours away.
We are planning to stay for 2.5 days in Washington DC and 4.5 days in New York. I know, we may run out of time, but we just want to see the highlights of those places. A colleague advised us to visit one place instead of two places, but because Washington DC and New York are en route back to Amsterdam in The Netherlands, it will cost us around 200 euros on top of the flight from Atlanta \u0026lt;=\u0026gt; Amsterdam, which is quite cheap.
Any tips you want to share for Washington DC or New York?
Let me know in the comments section.
I want to thank Avanade Netherlands for this fantastic opportunity!\nSee you there!\n","permalink":"https://devsecninja.com/2016/06/26/microsoft-ignite-2016-in-atlanta-here-we-come/","summary":"\u003cp\u003eI\u0026rsquo;m so happy to tell you that I\u0026rsquo;m one of the 5 people from Avanade Netherlands that will attend Microsoft Ignite in Atlanta this year!\u003c/p\u003e\n\u003cp\u003eWe are all really excited as this is the first time that we attend a big conference like Ignite.\u003c/p\u003e\n\u003cp\u003eLast year I followed a lot of online presentations from home, and the high quality of the presentations makes this the best Microsoft event in the world.\u003c/p\u003e","title":"Microsoft Ignite 2016 in Atlanta - Here we come!"},{"content":"My notebook connects to a Docking Station with access to my receiver with speaker set, 2 screens, power and a KVM switch for my mouse and keyboard.
When I lock my laptop, the sound switches from the receiver to my internal speakers.
When I unlock my laptop, the sound switches back but the Spotify application doesn\u0026rsquo;t play any sound.
Closing the application doesn\u0026rsquo;t solve this problem, because the application will crash and I have to use the Task Manager to force the application to close. I made a PowerShell function that I\u0026rsquo;ve added to my PowerShell profile.
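The author's actual function lives in the gist linked just below; as a rough, hedged sketch of the same idea (the function name comes from the gist, but the body and the Spotify install path under %APPDATA% are assumptions), it could look like this:

```powershell
# Hedged sketch, not the gist's exact code: stop every process whose
# name ends with "spotify", then launch a fresh Spotify instance.
function Restart-Spotify {
    Get-Process |
        Where-Object { $_.ProcessName -like '*spotify' } |
        Stop-Process -Force

    # Assumption: Spotify's default per-user install location.
    Start-Process "$env:APPDATA\Spotify\Spotify.exe"
}
```

Dropping a function like this into your PowerShell profile makes it available in every new console session.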
https://gist.github.com/jvravensberg/302a3de8dbc92b54812afc408f5c43ec The Restart-Spotify function looks for any process that ends with \u0026ldquo;spotify\u0026rdquo; and stops the process.
When all the processes are killed, a new instance of Spotify will be opened and the PowerShell console will close itself.
Even a reinstall of Spotify didn\u0026rsquo;t solve this issue, which I\u0026rsquo;ve been facing for months now.
So the above script is a great workaround for me.\n","permalink":"https://devsecninja.com/2016/06/19/powershell-function-to-restart-a-process/","summary":"\u003cp\u003eMy notebook connects to a Docking Station with access to my receiver with speaker set, 2 screens, power and a KVM switch for my mouse and keyboard.\u003c/p\u003e\n\u003cp\u003eWhen I lock my laptop, the sound switches from the receiver to my internal speakers.\u003c/p\u003e\n\u003cp\u003eWhen I unlock my laptop, the sound switches back but the Spotify application doesn\u0026rsquo;t play any sound.\u003c/p\u003e\n\u003cp\u003eClosing the application doesn\u0026rsquo;t solve this problem, because the application will crash and I have to use the Task Manager to force the application to close. I made a PowerShell function that I\u0026rsquo;ve added to my PowerShell profile. \u003ca href=\"https://gist.github.com/jvravensberg/302a3de8dbc92b54812afc408f5c43ec\"\u003ehttps://gist.github.com/jvravensberg/302a3de8dbc92b54812afc408f5c43ec\u003c/a\u003e The Restart-Spotify function looks for any process that ends with \u0026ldquo;spotify\u0026rdquo; and stops the process.\u003c/p\u003e","title":"PowerShell Function to Restart a Process"},{"content":"First of all, I absolutely love Let\u0026rsquo;s Encrypt. It\u0026rsquo;s a very easy way to protect a website.
All WordPress.com websites are protected with an SSL certificate from Let\u0026rsquo;s Encrypt as well.
I received an e-mail this morning from Let\u0026rsquo;s Encrypt about their new Subscriber Agreement.\nAbove the message, there is a big list with 3.125 e-mail addresses including my own e-mail address.\nLooks like they forgot to put those email addresses in the BCC of the email.\nThe e-mail was sent from the Let\u0026rsquo;s Encrypt mailservers because the SPF record is valid: Authentication-Results: spf=pass (sender IP is 198.2.180.10) smtp.mailfrom=mandrillapp.com;\nDear Let\u0026rsquo;s Encrypt Subscriber, We\u0026rsquo;re writing to let you know that we are updating the Let\u0026rsquo;s Encrypt Subscriber Agreement, effective June 30, 2016. You can find the updated agreement (v1.1) as well as the current agreement (v1.0.1) in the \u0026ldquo;Let\u0026rsquo;s Encrypt Subscriber Agreement\u0026rdquo; section of the following page: https://letsencrypt.org/repository/ Thank you for helping to secure the Web by using Let\u0026rsquo;s Encrypt.\nWe\u0026rsquo;re talking about a Certificate Authority here! Hopefully they\u0026rsquo;ll protect the SSL certificates in a better way. UPDATE: Official statement from Let\u0026rsquo;s Encrypt.\n","permalink":"https://devsecninja.com/2016/06/11/lets-encrypt-leaks-3.125-email-addresses/","summary":"\u003cp\u003eFirst of all, I absolutely love Let\u0026rsquo;s Encrypt. It\u0026rsquo;s a very easy way to protect a website.\u003c/p\u003e\n\u003cp\u003eAll \u003ca href=\"http://cloudenius.com/2016/04/09/https-everywhere-encryption-for-all-wordpress-com-sites/\"\u003eWordPress.com websites are protected\u003c/a\u003e with an SSL certificate from Let\u0026rsquo;s Encrypt as well. 
I received an e-mail this morning from Let\u0026rsquo;s Encrypt about their new Subscriber Agreement.\u003c/p\u003e\n\u003cp\u003eAbove the message, there is a big list with 3.125 e-mail addresses including my own e-mail address.\u003c/p\u003e\n\u003cp\u003eLooks like they forgot to put those email addresses in the BCC of the email.\u003c/p\u003e","title":"Let's Encrypt leaks 3.125 email addresses"},{"content":"My current work phone is a Lumia 925. Because it\u0026rsquo;s 2 years old, I can choose a new phone. This is a very hard choice, because I like the design of the iPhone, but I prefer Android over iOS. It would be an easy pick if I could choose the Nexus 6P, but the iPhone SE and the Nexus 5X are some really good competitors. I\u0026rsquo;ll use this blog post to help with my decision. Maybe it will help you as well.
Disadvantages Nexus 5X 32 GB vs iPhone SE 16 GB Nexus 5X 32 GB
iPhone SE 16 GB
Design - Plastic design vs Aluminium design of the iPhone.
16 GB Internal Storage - not ideal when shooting 4K. But I use Spotify for music streaming and I upload pictures and videos to Cloud Storage.
Privacy - What will Google do with your data? Of course, Apple could use your data as well, but Google is all about data.
Ecosystem - I had an iPhone 4S a couple of years ago, but I like the Nexus 7 tablet I have at the moment.
Small battery life - probably need to carry a powerbank with me.
No Google Chrome - I use the Google Chrome browser on all my Windows PCs and my Nexus 7 tablet. This will sync my bookmarks and history between devices.
**Big Display** - The display is quite big. I had a Nexus 5 before my Lumia 925 with a 4.95\u0026quot; display which is fine.
The Nexus 5X has a 5.2\u0026quot; display!
No Llama-like Apps without Jailbreak - I use this app to mute the phone when it\u0026rsquo;s charging between X time or open an app when connecting to a Bluetooth car kit on Android.
Weight - 136g vs 113g for the iPhone SE.
No USB Standard - The Nexus 5X uses USB-C, so I have to replace all my existing cables. But I know when I buy a new Android tablet, I can reuse the cables.
Performance Issues - Andrew Martonik, AC\u0026rsquo;s West Coast editor, noted that \u0026ldquo;whether it was opening apps, scrolling heavy webpages or switching between different areas of the phone, everything seems to take a little longer than it should.\u0026rdquo;
**Small Display** - The iPhone has a 4\u0026quot; display. That\u0026rsquo;s 0.5\u0026quot; smaller than my Lumia 925.
Front-facing camera - Low quality
NFC Ring doesn\u0026rsquo;t work - The NFC Ring doesn\u0026rsquo;t work on the iPhone SE.
Low resolution - Nexus has a Full HD display (1920x1080) vs 1136‑by‑640 for the iPhone SE. BUT, the iPhone is a lot smaller.
Verdict The Nexus 5X wins for me here. Let me know which phone you would pick or if I forgot to mention a disadvantage!\n","permalink":"https://devsecninja.com/2016/05/28/nexus-5x-32-gb-vs-iphone-se-16-gb/","summary":"\u003cp\u003eMy current work phone is a Lumia 925. Because it\u0026rsquo;s 2 years old, I can choose a new phone. This is a very hard choice, because I like the design of the iPhone, but I prefer Android over iOS. It would be an easy pick if I could choose the Nexus 6P, but the iPhone SE and the Nexus 5X are some really good competitors. I\u0026rsquo;ll use this blog post to help with my decision. Maybe it will help you as well.\u003c/p\u003e","title":"Nexus 5X 32 GB vs iPhone SE 16 GB"},{"content":"In this step-by-step guide, I\u0026rsquo;ll show you how to configure PfSense with an Azure Site-to-Site VPN by using a Dynamic Routing Gateway/Route-based Gateway. This even works with a VPN behind a NAT setup.
I was looking for a stable solution that could handle the new Route-based (IKE v2) Gateways. This tutorial is based on the new Azure Portal.\nPrerequisites A Hyper-V Host (Windows 10 is fine as well) 2 Hyper-V Virtual Networks. One called \u0026ldquo;External Virtual Network\u0026rdquo; and one called \u0026ldquo;Internal Virtual Network\u0026rdquo;. A Hyper-V VM with PfSense installed with NAT configured between the internal and external virtual network. Just download the ISO from the PfSense website and create a Generation 1 VM with it. Give it 512 or 1024 MB RAM and 1 vCPU and follow online installation instructions. Configuration of your Azure Virtual Network \u0026amp; Gateway Go to Portal.Azure.com and sign in to your Azure environment. Create a Virtual Network and use the default settings. Make sure that the address space is not the same as your internal network. Create a Subnet by opening the virtual network you just created and then click on the Subnets button under \u0026ldquo;General\u0026rdquo;. Create a Gateway Subnet by clicking on the \u0026ldquo;+ Gateway Subnet\u0026rdquo; button. In this tutorial, I use: 10.0.0.0/19 address space VM subnet of 10.0.0.0/20 (10.0.0.0 - 10.0.15.255) Gateway subnet of 10.0.16.0/29 (10.0.16.0 - 10.0.16.7). Create a Virtual Network gateway. Give it a name, select your Virtual Network and create a new Public IP address. Select VPN as gateway type and use the Route-based VPN type. Azure will start the deployment of your gateway now. This could take up to an hour, so take a short break. :) After the deployment has been completed, open the Virtual Network Gateway you just created. Click on Settings and Connections. Click on Add to create a connection. Give it a name, choose Site-to-site (IPsec) as the connection type, create a new local network with the Public IP address of the PfSense instance and use a strong \u0026lsquo;password\u0026rsquo; as PSK.
The Public IP address could be an IPv4 Address of a router, which is the gateway of the PfSense VPN VM. Open the PfSense Web Portal. Go to the VPN button in the top menu and open IPsec. Click on \u0026ldquo;Add P1\u0026rdquo;. Use the settings from the Phase 1 table below. Leave other settings as default. You\u0026rsquo;ll see a new entry in the IPsec Tunnels overview. Click on the Show Phase 2 Entries button and add a new P2 entry by clicking on the Add P2 button. Use the settings from the Phase 2 table below. Go to Status, IPsec from the top menu. There you will see the new VPN connection. Click on Connect VPN. Press F5. You\u0026rsquo;ll see that the status is jumping between ESTABLISHED and CONNECTING or ESTABLISHED X seconds and ESTABLISHED 0 seconds. Give it some time here. It can take a couple of minutes to get this working. Check the logging under Status, System Logs and IPsec. Check if the status of the Connection in Azure is set to \u0026ldquo;Connected\u0026rdquo; as well. (Optional) Don\u0026rsquo;t forget to give your IPsec VM a static MAC Address and IP Address from your router or within the Web Interface. (Optional) Give your PfSense VM a reboot to check if the VPN works after a reboot. Try to RDP to an Azure VM from your Internal Hyper-V network or do a trace from your command line to a VM: tracert 10.0.0.5. Phase 1\nName\nSetting\nKey Exchange Version\nV2\nRemote Gateway\nEnter public IP of VNet gateway\nPre-Shared Key\nEnter the PSK of the connection\nPhase 1 Lifetime\n10800\nPhase 2\nName\nSetting\nLocal Network\nChoose your LAN network here if you are using NAT\nRemote Network\nUse the whole Azure subnet.
I use 10.0.0.0/19\nDescription\nFor example, Subnet-1\nProtocol\nESP\nEncryption Algorithms\nOnly select AES / 128 bits\nHash Algorithm\nSHA1\nPFS Key Group\n2 (1024 bit)\n","permalink":"https://devsecninja.com/2016/05/22/configure-azure-vpn-with-pfsense-and-a-dynamic-routing/route-based-gateway/","summary":"\u003cp\u003eIn this step-by-step, I\u0026rsquo;ll show you how to configure PfSense with an Azure Site-to-Site VPN by using a Dynamic Routing Gateway/Route-based Gateway. This even works with a VPN \u003cstrong\u003ebehind a NAT setup\u003c/strong\u003e. I was looking for a stable solution that could handle the new Route-based (IKE v2) Gateways. This tutorial is based on the new Azure Portal.\u003c/p\u003e\n\u003ch2 id=\"prerequisites\"\u003ePrerequisites\u003c/h2\u003e\n\u003cul\u003e\n\u003cli\u003eA Hyper-V Host (Windows 10 is fine as well)\u003c/li\u003e\n\u003cli\u003e2 Hyper-V Virtual Networks. One called \u0026ldquo;External Virtual Network\u0026rdquo; and one called \u0026ldquo;Internal Virtual Network\u0026rdquo;.\u003c/li\u003e\n\u003cli\u003eA Hyper-V VM with PfSense installed with NAT configured between the internal and external virtual network. Just download the ISO from the \u003ca href=\"https://pfsense.org/\"\u003ePfSense website\u003c/a\u003e and create a Generation 1 VM with it. Give it 512 or 1024 MB RAM and 1 vCPU and follow online installation instructions.\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"configuration-of-your-azure-virtual-network--gateway\"\u003eConfiguration of your Azure Virtual Network \u0026amp; Gateway\u003c/h2\u003e\n\u003col\u003e\n\u003cli\u003eGo to \u003ca href=\"https://portal.azure.com\"\u003ePortal.Azure.com\u003c/a\u003e and sign in to your Azure environment.\u003c/li\u003e\n\u003cli\u003eCreate a Virtual Network and use the default settings. 
Make sure that the address space is not the same as your internal network.\u003c/li\u003e\n\u003cli\u003eCreate a Subnet by opening the virtual network you just created and then click on the \u003cstrong\u003eSubnets\u003c/strong\u003e button under \u0026ldquo;General\u0026rdquo;. Create a \u003cstrong\u003eGateway Subnet\u003c/strong\u003e by clicking on the \u0026ldquo;+ Gateway Subnet\u0026rdquo; button. In this tutorial, I use:\n\u003col\u003e\n\u003cli\u003e10.0.0.0/19 address space\u003c/li\u003e\n\u003cli\u003eVM subnet of 10.0.0.0/20 (10.0.0.0 - 10.0.15.255)\u003c/li\u003e\n\u003cli\u003eGateway subnet of 10.0.16.0/29 (10.0.16.0 - 10.0.16.7).\u003c/li\u003e\n\u003c/ol\u003e\n\u003c/li\u003e\n\u003cli\u003eCreate a Virtual Network gateway. Give it a name, select your Virtual Network and create a new Public IP address. Select \u003cstrong\u003eVPN\u003c/strong\u003e as gateway type and use the \u003cstrong\u003eRoute-based\u003c/strong\u003e VPN type.\u003c/li\u003e\n\u003cli\u003eAzure will start the deployment of your gateway now. This could take up to an hour, so take a short break. :)\u003c/li\u003e\n\u003cli\u003eAfter the deployment has been completed, open the Virtual Network Gateway you just created. Click on \u003cstrong\u003eSettings\u003c/strong\u003e and \u003cstrong\u003eConnections\u003c/strong\u003e. Click on \u003cstrong\u003eAdd\u003c/strong\u003e to create a connection. Give it a name, choose \u003cstrong\u003eSite-to-site (IPsec)\u003c/strong\u003e as the connection type, create a new local network with the Public IP address of the PfSense instance and use a strong \u0026lsquo;password\u0026rsquo; as PSK. The Public IP address could be an IPv4 Address of a router, which is the gateway of the PfSense VPN VM.\u003c/li\u003e\n\u003cli\u003eOpen the PfSense Web Portal. Go to the \u003cstrong\u003eVPN\u003c/strong\u003e button in the top menu and open \u003cstrong\u003eIPsec\u003c/strong\u003e.
Click on \u0026ldquo;\u003cstrong\u003eAdd P1\u003c/strong\u003e\u0026rdquo;.\u003c/li\u003e\n\u003cli\u003eUse the settings from the Phase 1 table below. Leave other settings as default.\u003c/li\u003e\n\u003cli\u003eYou\u0026rsquo;ll see a new entry in the IPsec Tunnels overview. Click on the \u003cstrong\u003eShow Phase 2 Entries\u003c/strong\u003e button and add a new P2 entry by clicking on the \u003cstrong\u003eAdd P2\u003c/strong\u003e button.\u003c/li\u003e\n\u003cli\u003eUse the settings from the Phase 2 table below.\u003c/li\u003e\n\u003cli\u003eGo to \u003cstrong\u003eStatus\u003c/strong\u003e, \u003cstrong\u003eIPsec\u003c/strong\u003e from the top menu. There you will see the new VPN connection. Click on \u003cstrong\u003eConnect VPN\u003c/strong\u003e. Click on F5. You\u0026rsquo;ll see that the status is jumping between \u003cstrong\u003eESTABLISHED\u003c/strong\u003e and \u003cstrong\u003eCONNECTING\u003c/strong\u003e or \u003cstrong\u003eESTABLISHED X seconds\u003c/strong\u003e and \u003cstrong\u003eESTABLISHED 0 seconds\u003c/strong\u003e. Give it some time here. It can take a couple of minutes to get this working. Check the logging under \u003cstrong\u003eStatus,\u003c/strong\u003e \u003cstrong\u003eSystem Logs\u003c/strong\u003e and \u003cstrong\u003eIPsec\u003c/strong\u003e. 
Check if the status of the \u003cstrong\u003eConnection\u003c/strong\u003e in Azure is set to \u0026ldquo;Connected\u0026rdquo; as well.\u003c/li\u003e\n\u003cli\u003e(Optional) Don\u0026rsquo;t forget to give your IPsec VM a static MAC Address and IP Address from your router or within the Web Interface.\u003c/li\u003e\n\u003cli\u003e(Optional) Give your PfSense VM a reboot to check if the VPN works after a reboot.\u003c/li\u003e\n\u003cli\u003eTry to RDP to an Azure VM from your Internal Hyper-V network or do a trace from your command line to a VM: \u003cstrong\u003etracert 10.0.0.5\u003c/strong\u003e.\u003c/li\u003e\n\u003c/ol\u003e\n\u003cp\u003e \u003c/p\u003e","title":"Configure Azure VPN with PfSense and a Dynamic Routing/Route-based Gateway"},{"content":"WARNING: Removing Windows 10 Apps can make your system unstable. I had issues with my NUC after removing some default applications. Don\u0026rsquo;t do this in your master Enterprise image! Block apps with AppLocker instead. Use the following PowerShell command to check which Windows 10 Apps are installed:\nGet-AppxPackage | Select Name Make sure that you get all the packages that you want to delete in one view. For example:\nGet-AppxPackage | Where {$_.Name -ilike \u0026#34;Microsoft.ZuneVideo\u0026#34; -or $_.Name -ilike \u0026#34;Microsoft.WindowsCamera\u0026#34;} To remove those packages, pipe them to Remove-AppxPackage.\nGet-AppxPackage | Where {$_.Name -ilike \u0026#34;Microsoft.ZuneVideo\u0026#34; -or $_.Name -ilike \u0026#34;Microsoft.WindowsCamera\u0026#34;} | Remove-AppxPackage ","permalink":"https://devsecninja.com/2016/05/15/remove-default-windows-10-apps/","summary":"\u003cp\u003e\u003cstrong\u003eWARNING:\u003c/strong\u003e \u003cstrong\u003eRemoving Windows 10 Apps can make your system unstable.\u003c/strong\u003e I had issues with my NUC after removing some default applications. Don\u0026rsquo;t do this in your master Enterprise image! Block apps with AppLocker instead.
Use the following PowerShell command to check which Windows 10 Apps are installed:\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" class=\"chroma\"\u003e\u003ccode class=\"language-powershell\" data-lang=\"powershell\"\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nb\"\u003eGet-AppxPackage\u003c/span\u003e \u003cspan class=\"p\"\u003e|\u003c/span\u003e \u003cspan class=\"nb\"\u003eSelect \u003c/span\u003e\u003cspan class=\"n\"\u003eName\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cp\u003eMake sure that you get all the packages that you want to delete in one view. For example:\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" class=\"chroma\"\u003e\u003ccode class=\"language-powershell\" data-lang=\"powershell\"\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nb\"\u003eGet-AppxPackage\u003c/span\u003e \u003cspan class=\"p\"\u003e|\u003c/span\u003e \u003cspan class=\"nb\"\u003eWhere \u003c/span\u003e\u003cspan class=\"p\"\u003e{\u003c/span\u003e\u003cspan class=\"nv\"\u003e$_\u003c/span\u003e\u003cspan class=\"p\"\u003e.\u003c/span\u003e\u003cspan class=\"py\"\u003eName\u003c/span\u003e \u003cspan class=\"o\"\u003e-ilike\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Microsoft.ZuneVideo\u0026#34;\u003c/span\u003e \u003cspan class=\"o\"\u003e-or\u003c/span\u003e \u003cspan class=\"nv\"\u003e$_\u003c/span\u003e\u003cspan class=\"p\"\u003e.\u003c/span\u003e\u003cspan class=\"py\"\u003eName\u003c/span\u003e \u003cspan class=\"o\"\u003e-ilike\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Microsoft.WindowsCamera\u0026#34;\u003c/span\u003e\u003cspan class=\"p\"\u003e}\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cp\u003eTo remove those packages, pipe it to Remove-AppxPackage.\u003c/p\u003e","title":"Remove 
default Windows 10 Apps"},{"content":"Recently I passed the 70-533 Implementing Microsoft Azure Infrastructure Solutions exam. I was thinking about stopping with the Azure exams but I couldn\u0026rsquo;t resist passing this exam as well.\nToday I passed the 70-534 - Architecting Microsoft Azure Solutions exam with 857 points. I think that the 70-534 exam is a lot easier than the 70-533 exam.\nMost of the questions were ones where you had to select the Azure features that you could use in a specific situation.\nThis blog post is written with the knowledge needed for the 70-533 exam in mind. [70-534 Score Report](/images/2016/04/70-534_scorereport1.png)\nStudy Resources Pluralsight courses These courses by Orin Thomas are really helpful for getting an overview of the exam objectives. Architecting Azure Solutions (70-534): Infrastructure and Networking Architecting Azure Solutions (70-534): Design an Advanced Application Architecting Azure Solutions (70-534): Secure Resources Architecting Azure Solutions (70-534): Application Storage and Data Access Strategy Microsoft Press books This book, written by Haishi Bai, Steve Maier and Dan Stolts, is a good book to get an overview of the exam objectives. Tip: write down the objectives that you don\u0026rsquo;t remember anymore or objectives that you don\u0026rsquo;t understand. Exam Ref 70-534 Architecting Microsoft Azure Solutions Channel 9 Sidney Andrews walks you through all the exam objectives and gives you a nice overview. I didn\u0026rsquo;t watch the full video; I only watched the content that was relevant to me. 70-534: Architecting Microsoft Azure Solutions Other resources StorSimple on Channel 9 Early Experts Study Guide - take your time for this study guide MeasureUp Practice Exams Check the notes of your 70-533 exam!\nGood luck on passing this exam!
Let me know what you think of the exam in the comments section.\n","permalink":"https://devsecninja.com/2016/04/27/azure-70-534-exam-tips-april-2016/","summary":"\u003cp\u003eRecently I passed the \u003ca href=\"http://cloudenius.com/2016/03/20/azure-70-533-exam-tips-march-2016/\"\u003e\u003cstrong\u003e70-533 Implementing Microsoft Azure Infrastructure Solutions\u003c/strong\u003e\u003c/a\u003e exam. I was thinking about stopping with the Azure exams but I couldn\u0026rsquo;t resist passing this exam as well.\u003c/p\u003e\n\u003cp\u003eToday I passed the 70-534 - Architecting Microsoft Azure Solutions exam with 857 points. I think that the 70-534 exam is a lot easier than the 70-533 exam.\u003c/p\u003e\n\u003cp\u003eMost of the questions were ones where you had to select the Azure features that you could use in a specific situation.\u003c/p\u003e","title":"Azure 70-534 Exam Tips - April 2016"},{"content":"Problem: Multicast during an SCCM 2012 R2 SP1 (1511 release) Task Sequence fails with error \u0026ldquo;Failed to get MCS key (Code 0x80004005)\u0026rdquo;.
This error is found in the smsts.log log file on the (Windows 10 Enterprise x64 1511) client machine.\nSMSTS.log file contents CLibSMSMessageWinHttpTransport::Send: URL: SCCM01.CORP.DOMAIN.COM:443 CCM_POST /SMS_MCS_AltAuth/.sms_mcs?op=keyinfo ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290) In SSL, but with no client cert ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290) `Request was successful.\nApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)pNext != NULL, HRESULT=80004005 (e:\\nts_sccm_release\\sms\\framework\\osdmessaging\\libsmsmessaging.cpp,2054) ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)reply has no message header marker ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)DoRequest (sReply, true), HRESULT=80004005 (e:\\nts_sccm_release\\sms\\framework\\osdmessaging\\libsmsmessaging.cpp,10358) ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)oMcsRequest.GetMCSKey(mcsKeyInfoResponse), HRESULT=80004005 (e:\\nts_sccm_release\\sms\\server\\mcs\\consumer\\mcsisapiclient.cpp,429) ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)Failed to get MCS key (Code 0x80004005) ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)ClientRequestToMCS::DoRequest failed. error = (0x80004005).\nApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)``Request to MCS \u0026lsquo;SCCM01.CORP.DOMAIN.COM\u0026rsquo; failed with error (Code 0x80004005).\nApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)``Multicast OpenSessionRequest failed (0x80004005).\nApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)``Sending status message: SMS_OSDeployment_PackageDownloadMulticastStatusFail ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)`\nSolution Fix 1: could fix your problem but didn\u0026rsquo;t work for me: Uncheck \u0026ldquo;Allow this package to transfered via multicast\u0026rdquo; at packages and your install image properties. 
Remove the \u0026ldquo;SerializedMCSKey\u0026rdquo; and \u0026ldquo;SignedSerializedMCSKey\u0026rdquo; keys in registry at HKLM\\Software\\SMS\\MCS. Uncheck \u0026ldquo;Enable multicast\u0026rdquo; at the properties of the Distribution Point. Wait for 30 to 60 minutes until multicast is fully removed from the Distribution Point. Check this in the SCCM console under the site roles status view and check the log files on the distribution point. Check \u0026ldquo;Enable multicast\u0026rdquo; at the properties of the Distribution Point. Check if the \u0026ldquo;SerializedMCSKey\u0026rdquo; and \u0026ldquo;SignedSerializedMCSKey\u0026rdquo; keys in registry at HKLM\\Software\\SMS\\MCS are populated again. This could take a couple of minutes or a couple of hours. I couldn\u0026rsquo;t find how to force this. If you know a way, let me know in the comments section. Check \u0026ldquo;Allow this package to be transferred via multicast\u0026rdquo; at packages and your install image properties. Wait again for 15 to 30 minutes until the packages are multicast enabled. A background process runs to do this. Microsoft recommended waiting around 15 to 30 minutes. Fix 2: Rebuild your distribution point server by removing the server and installing Windows Server again. This was the only thing that fixed it for me after spending a lot of hours with Microsoft Support. We enabled multicast during the installation of the environment and disabled it a couple of weeks later. When we wanted to enable it again, this problem started to occur. Let me know in the comments section if this blog post has fixed your issue too! Cheers!\n","permalink":"https://devsecninja.com/2016/04/15/sccm-multicast-on-client-fails-with-error-failed-to-get-mcs-key-code-0x80004005/","summary":"\u003ch3 id=\"problem\"\u003eProblem:\u003c/h3\u003e\n\u003cp\u003eMulticast during an SCCM 2012 R2 SP1 (1511 release) Task Sequence fails with error \u0026ldquo;Failed to get MCS key (Code 0x80004005)\u0026rdquo;.
This error is found in the smsts.log log file on the (Windows 10 Enterprise x64 1511) client machine.\u003c/p\u003e\n\u003ch3 id=\"smstslog-file-contents\"\u003eSMSTS.log file contents\u003c/h3\u003e\n\u003cp\u003e\u003ccode\u003eCLibSMSMessageWinHttpTransport::Send: URL: SCCM01.CORP.DOMAIN.COM:443 CCM_POST /SMS_MCS_AltAuth/.sms_mcs?op=keyinfo ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)\u003c/code\u003e \u003ccode\u003eIn SSL, but with no client cert ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)\u003c/code\u003e `Request was successful.\u003c/p\u003e\n\u003cp\u003eApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)\u003ccode\u003epNext != NULL, HRESULT=80004005 (e:\\nts_sccm_release\\sms\\framework\\osdmessaging\\libsmsmessaging.cpp,2054) ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)\u003c/code\u003ereply has no message header marker ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)\u003ccode\u003eDoRequest (sReply, true), HRESULT=80004005 (e:\\nts_sccm_release\\sms\\framework\\osdmessaging\\libsmsmessaging.cpp,10358) ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)\u003c/code\u003eoMcsRequest.GetMCSKey(mcsKeyInfoResponse), HRESULT=80004005 (e:\\nts_sccm_release\\sms\\server\\mcs\\consumer\\mcsisapiclient.cpp,429) ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)\u003ccode\u003eFailed to get MCS key (Code 0x80004005) ApplyOperatingSystem 15-4-2016 9:02:57 656 (0x0290)\u003c/code\u003eClientRequestToMCS::DoRequest failed. error = (0x80004005).\u003c/p\u003e","title":"SCCM - Multicast on client fails with error \"Failed to get MCS key (Code 0x80004005)\""},{"content":"A couple of months ago I decided to migrate my blog from a self-hosted WordPress.org blog to WordPress.com. In this blog post, I\u0026rsquo;ll tell you why I did that.\nNo maintenance required A self-hosted WordPress.org installation needs maintenance. Plugins as well as the WordPress installation needs to be updated multiple times per month. 
I know there is an automatic update mechanism for WordPress which works fine for a basic installation of WordPress, but there is a chance that the upgrade will break your plugins and you have to restore a back-up to temporarily fix your website.\nThen you can start troubleshooting what happened.\nThat brings us to another point: keeping WordPress up and running with back-ups.\nYou have to restore a back-up every couple of weeks to check if your back-up still works.\nYou have to think about what happens to your blog if the server crashes or - even worse - if the datacenter is on fire.\nPlugins needed for a good WordPress.org installation Plugins are a great way to add functionality to WordPress. On my self-hosted blog, I had some SEO and Caching plugins active because WordPress is really slow without them.\nPlugins can contain bugs as well and could possibly harm your blog.\nWith WordPress.com, your blog will be amazingly fast like mine is right now.\nWordPress.com has some really good DDoS protection to protect your site when it\u0026rsquo;s under attack.\nBy default, WordPress.com has some extra functionality.\nThose features can be installed on a self-hosted WordPress.org installation too, by installing WordPress JetPack.\nBecause I build and maintain a lot of IT environments during work hours, I didn\u0026rsquo;t want to maintain a blog and a web server in my spare time.\nFree SSL As of April 8th, all custom WordPress.com blogs are SSL protected by Let\u0026rsquo;s Encrypt! I requested this feature a couple of months before the introduction, and I think that WordPress received a lot of those requests as well. Google will improve your ranking if your site is protected with SSL. This was one of the reasons for me not to migrate to WordPress.com before. It\u0026rsquo;s nice to see all WordPress.com sites are SSL protected now!\nPricing WordPress.com is free with a *.wordpress.com domain (e.g.
jvrtech.wordpress.com).\nIf you want a custom domain for your WordPress site, that starts at +/- $ 18 per year.\nDon\u0026rsquo;t forget to apply for the Privacy Protection for $ 8 per year as well, because this will hide your name in the WHOIS data of your domain.\nThis will protect you from getting a lot of spam from companies that want to maintain or design your website.\nSo that will cost you $ 26 per year in total.\nThat\u0026rsquo;s like $ 2.16 per month.\nIt\u0026rsquo;s amazingly fast, secure and requires no maintenance.\nThe downside of this is that your site will contain ads.\nIf you have just started blogging, that\u0026rsquo;s fine.\nIf you want to remove those ads from your site, you can upgrade your plan to Premium.\nThat will cost you $ 99 per year, and includes the domain and Privacy Protection.\nThat\u0026rsquo;s $ 8.25 per month.\nMost web hosting companies will charge you like $ 3 per month for the hosting, $ 2 per month for the domain and $ 2 per month for the SSL certificate.\nVerdict WordPress.com + No Maintenance needed. + It requires little knowledge about hosting or setting up a domain. Your blog will be online within minutes. - No plugins can be installed. - No custom templates can be downloaded from the internet and installed, but you\u0026rsquo;ll find some good free and paid templates in the WordPress.com store. WordPress.com will cost you $ 8.25 per month without ads.\nWordPress.org (Self-Hosted) + Highly customizable. You can make or install any plugins you want and change the layout to your needs. If you want a custom template, this is the way to go. + Can be cheap if you search for a cheap hosting company.\nNeeds a caching solution.\nNeeds maintenance and a back-up strategy.\nYou need to know how to set up and maintain a WordPress site on a hosting platform.
Hosting for WordPress.org will cost you around $ 5 - 10 per month and can be cheap if you buy hosting from a cheap hosting provider (not recommended) or can be quite expensive if you have a lot of page visits. Happy blogging! Let me know why you are using a self-hosted WordPress.org installation or WordPress.com in the comments section! WordPress didn\u0026rsquo;t pay me to write this blog post. It\u0026rsquo;s just my opinion.\n","permalink":"https://devsecninja.com/2016/04/13/self-hosted-wordpress.org-vs-wordpress.com/","summary":"\u003cp\u003eA couple of months ago I decided to migrate my blog from a self-hosted WordPress.org blog to WordPress.com. In this blog post, I\u0026rsquo;ll tell you why I did that.\u003c/p\u003e\n\u003ch4 id=\"no-maintenance-required\"\u003e\u003cstrong\u003eNo maintenance required\u003c/strong\u003e\u003c/h4\u003e\n\u003cp\u003eA self-hosted WordPress.org installation needs maintenance. Plugins as well as the WordPress installation needs to be updated multiple times per month. I know there is an automatic update mechanism for WordPress which works fine for a basic installation of WordPress, but there is a chance that the upgrade will break your plugins and you have to restore a back-up to temporarily fix your website.\u003c/p\u003e","title":"Self-Hosted WordPress.org vs WordPress.com"},{"content":"I\u0026rsquo;ve requested this feature with many more WordPress Bloggers. Thank you WordPress!\n","permalink":"https://devsecninja.com/2016/04/09/https-everywhere-encryption-for-all-wordpress.com-sites/","summary":"\u003cp\u003eI\u0026rsquo;ve requested this feature with many more WordPress Bloggers. 
Thank you WordPress!\u003c/p\u003e","title":"HTTPS Everywhere: Encryption for All WordPress.com Sites"},{"content":"Problem: If you see the following error in your IIS Logs (C:\\inetpub\\logs\\LogFiles\\W3SVC1), it\u0026rsquo;s possible that the CRL of your Certificate Authority isn\u0026rsquo;t reachable or valid anymore: GET /SMS_MP/.sms_aut MPLIST 443 - SMS_MP_CONTROL_MANAGER - 403 13 2148081683 5701 18\nSolution: Export a certificate from your personal certificate store, for example, an SCCM Client Certificate to your C: drive. Open a command prompt with elevated rights and type:\ncertutil -url \u0026#34;C:\\Certificate.cer\u0026#34;\nCheck if the CRL can be verified. Open the CRL manually and check that the BASE and DELTA CRLs aren\u0026rsquo;t expired. In this case, the AD CS service wasn\u0026rsquo;t started and the Delta CRLs were not up-to-date. The service may have crashed, even though the startup type was set to \u0026ldquo;Automatic\u0026rdquo;.\n","permalink":"https://devsecninja.com/2016/03/21/sccm-iis-error-code-403-13-2148081683/","summary":"\u003ch2 id=\"problem\"\u003eProblem:\u003c/h2\u003e\n\u003cp\u003eIf you see the following error in your IIS Logs (C:\\inetpub\\logs\\LogFiles\\W3SVC1), it\u0026rsquo;s possible that the CRL of your Certificate Authority isn\u0026rsquo;t reachable or valid anymore: \u003c!-- raw HTML omitted --\u003e GET /SMS_MP/.sms_aut MPLIST 443 - \u003c!-- raw HTML omitted --\u003e SMS_MP_CONTROL_MANAGER - 403 13 2148081683 5701 18\u003c/p\u003e\n\u003ch2 id=\"solution\"\u003eSolution:\u003c/h2\u003e\n\u003cp\u003eExport a certificate from your personal certificate store, for example, an SCCM Client Certificate to your C: drive. Open a command prompt with elevated rights and type:\u003c/p\u003e","title":"SCCM - IIS Error code 403 13 2148081683"},{"content":"Yesterday (Saturday 20-03-2016) I passed the Azure 70-533 (Implementing Microsoft Azure Infrastructure Solutions) exam.
This was the second time that I did the exam and I passed with 747 points. So that\u0026rsquo;s a close call. Here are some tips for passing the exam:\nPractice every exam objective in your (Azure) lab. This is very important because you will get some questions in the exam where you need to describe, step by step, how to install or configure a service. If you don\u0026rsquo;t have a subscription, you could sign up for the free one-month trial. Do the online proctored exam. I absolutely recommend this new way of doing exams. During workdays, it\u0026rsquo;s annoying to leave a project for a couple of hours to do an exam. Pros: - Study from home, in your trusted environment. No travel time needed and it took away some stress during the exam.\nBook the exam 10/15 minutes before start time. The first time I did the exam, I postponed it 2 times because at the end of the day, I wasn\u0026rsquo;t quite ready or fit. Now it\u0026rsquo;s possible to book the exam when you are ready.\nBook the exam on Saturday. - You are allowed to have a glass of water on your desk when doing the exam. In the exam conditions, they said that it\u0026rsquo;s not allowed but my proctor told me that it\u0026rsquo;s fine.\nCons: - Bye privacy! The recordings of your webcam session with audio and video footage are in the hands of Microsoft. During your exam, you don\u0026rsquo;t have any privacy rights.\nYou need to install additional software. I heard some horror stories about the Pearson Vue exam software, but it worked very well on my Dell Latitude E6540 work notebook with Windows 10.\nTaking notes is not allowed during the exam.\nStudy from multiple resources. As far as I know, there isn\u0026rsquo;t a course or book that will tell you everything about this exam. I found out that it wasn\u0026rsquo;t easy to tell when you are ready for the exam, because of all those different study resources. That\u0026rsquo;s why I failed the first time.
When I was studying for the exam, I\u0026rsquo;ve used the following resources: Microsoft Virtual Academy Azure Fundamentals Azure Fundamentals - Websites Azure Fundamentals - Storage and Data Azure IaaS Technical Fundamentals PluralSight Implementing Cloud Services for Azure Infrastructure (70-533) [Course] Implementing Websites for Azure Infrastructure 70-533 [Course] Preparing to Pass the Microsoft Azure 70-533 Exam [Course] Microsoft Press Books If you are an MCP, don\u0026rsquo;t forget to use your MCP Voucher: Exam Ref 70-533 Implementing Microsoft Azure Infrastructure Solutions CloudThat Azure Certification Boot Camp for Exam 70-533 MeasureUp My employer has a contract with MeasureUp. We can use the MeasureUp Practice Exams to check if we are ready to take the exam. Exam changes as of March 10, 2016 BuildAzure - overview of all the changes. The official Microsoft document with changes. Other study resources AzureMan (Blog post of Bert Wolters) BuildAzure Good luck! Let me know what you think of the exam in the comments section.\n","permalink":"https://devsecninja.com/2016/03/20/azure-70-533-exam-tips-march-2016/","summary":"\u003cp\u003eYesterday (Saturday 20-03-2016) I passed the Azure 70-533 (Implementing Microsoft Azure Infrastructure Solutions) exam. This was the second time that I did the exam and I passed with 747 points. So that\u0026rsquo;s a close call. \u003ca href=\"/images/2016/03/70-533-scorereport.png\"\u003e\u003cimg alt=\"70-533-ScoreReport\" loading=\"lazy\" src=\"/images/2016/03/70-533-scorereport.png\"\u003e\u003c/a\u003e Here are some tips for passing the exam:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003ePractice every exam objective in your (Azure) lab\u003c/strong\u003e. This is very important because you will get some questions in the exam where you need to tell step by step which step you took to install or configure a service. 
If you don\u0026rsquo;t have a subscription, you could sign up for the \u003ca href=\"https://azure.microsoft.com/en-us/pricing/free-trial/\"\u003efree one-month trial\u003c/a\u003e.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eDo the online proctored exam\u003c/strong\u003e. I absolutely recommend this new way of doing exams. During workdays, it\u0026rsquo;s annoying to leave a project for a couple of hours to do an exam.\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003ePros:\u003c/strong\u003e \u003cstrong\u003e- Study from home, in your trusted environment.\u003c/strong\u003e No travel time needed and it took away some stress during the exam.\u003c/p\u003e","title":"Azure 70-533 Exam Tips - March 2016"},{"content":"If you have an MSDN or Visual Studio subscription, you\u0026rsquo;ll automatically get Azure credits.\nBecause I need to test System Center - Configuration Manager/Virtual Machine Manager/Operations Manager for my work, I have a Visual Studio Enterprise with MSDN subscription.\nThat gives me $150 credit on Azure automatically and makes it very easy to create a (big) home lab for dev/test or study purposes.\nYou can do a lot with $150 in Azure, if you turn your virtual machines off when you don\u0026rsquo;t use them. 
I have a PowerShell script that runs at the end of the day to shut down all my running virtual machines.\nThat saves me a lot of Azure credits.\nMy lab contains the following virtual machines:\nWindows 10 Test Machine - for testing new Windows 10 Builds Windows Server Containers - for testing the new Windows Server Container features Domain Controller - with Server 2012 R2 Data Protection Manager - to demo the functionalities of DPM DSC Machines - for testing PowerShell DSC RDS - to show how easy it is to deploy an RDS farm Root CA - always turned off and is used when my Root CA has expired SCCM Server - to test the new SCCM Builds SCOM Server - to test SCOM management packs SQL Server - to host the System Center databases. (You can\u0026rsquo;t host the SCCM or SCOM database on an Azure SQL database at the moment.) Do you have an MSDN subscription? You can start testing, developing and training in Azure right now. Let me know how you spend your Azure credits in the comment section!\n","permalink":"https://devsecninja.com/2016/03/13/how-do-you-spend-your-monthly-azure-msdn-credit/","summary":"\u003cp\u003eIf you have an MSDN or Visual Studio subscription, you\u0026rsquo;ll automatically \u003ca href=\"https://azure.microsoft.com/en-us/pricing/member-offers/msdn-benefits-details/\"\u003eget Azure credits\u003c/a\u003e.\u003c/p\u003e\n\u003cp\u003eBecause I need to test System Center - Configuration Manager/Virtual Machine Manager/Operations Manager for my work, I have a Visual Studio Enterprise with MSDN subscription.\u003c/p\u003e\n\u003cp\u003eThat gives me $150 credit on Azure automatically and makes it very easy to create a (big) home lab for dev/test or study purposes.\u003c/p\u003e\n\u003cp\u003eYou can do a lot with $150 in Azure, if you turn your virtual machines off when you don\u0026rsquo;t use them. 
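The nightly shutdown script mentioned in this post is not reproduced here. As an illustration only (my own sketch, written against the current Az module rather than the classic 2016-era Azure cmdlets the author would have used), such a script could look like this:

```powershell
# Hypothetical sketch of a nightly "stop all running VMs" script.
# Assumes the Az PowerShell module and an already-authenticated session
# (Connect-AzAccount run beforehand or inside an Automation runbook).
Get-AzVM -Status |
    Where-Object { $_.PowerState -eq 'VM running' } |
    ForEach-Object {
        # Stop-AzVM deallocates by default, which stops compute billing;
        # -Force suppresses the confirmation prompt so it can run unattended.
        Stop-AzVM -ResourceGroupName $_.ResourceGroupName -Name $_.Name -Force
    }
```

Scheduled via Task Scheduler or an Azure Automation runbook at the end of the day, this achieves the credit-saving behavior described.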
I have a PowerShell script that runs at the end of the day to shutdown all my running virtual machines.\u003c/p\u003e","title":"How do you spend your monthly Azure MSDN credit?"},{"content":"Really good information about the new Azure exam changes.\n","permalink":"https://devsecninja.com/2016/03/09/azure-cert-exam-update-march-10-2016/","summary":"\u003cp\u003eReally good information about the new Azure exam changes.\u003c/p\u003e","title":"Azure Cert Exam Update: March 10, 2016"},{"content":"With TPM 1.2, Microsoft was able to clear the TPM during the SCCM Task Sequence without asking for permission to clear the TPM.\nWith TPM 2.0, SCCM is unable to clear and activate the TPM chip during the deployment.\nThe first time you boot your computer, you need to provide a BitLocker Recovery Key, or the tpm.msc console will tell you that the TPM is ready for use, with reduced functionality. I found a script online that I\u0026rsquo;ve added to my GitHub to clear the TPM 2.0 chip during the deployment.\nYou need to reboot the computer after running this script and it will give a UEFI pop-up during the deployment asking the user for permission to clear the TPM chip. 
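The linked Reset-TPMOwner.ps1 is not quoted in the post; scripts of this kind typically drive the Win32_Tpm WMI class. A hedged sketch of that approach (my own illustration, not the linked script):

```powershell
# Hypothetical sketch: queue a TPM clear via the Win32_Tpm WMI class.
# PPI operation 14 requests Clear + Enable + Activate; with TPM 2.0 the
# firmware still prompts for physical presence on the next boot, as described.
$tpm = Get-WmiObject -Namespace 'root\cimv2\Security\MicrosoftTpm' -Class 'Win32_Tpm'
if ($tpm) {
    # ReturnValue 0 means the request was accepted and is pending the reboot.
    $result = $tpm.SetPhysicalPresenceRequest(14)
    Write-Output "SetPhysicalPresenceRequest returned $($result.ReturnValue)"
}
```

A reboot step must follow this in the task sequence, exactly as the post describes.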
(Physical Presence) I heard from a vendor that Microsoft is working on a workaround to disable the Physical Presence during the deployment.\nYou could experiment with the \u0026ldquo;NoPPIclear\u0026rdquo; TPM setting to disable this physical presence feature next time you deploy a computer.\nYour Task Sequence should look like this: - Run the PowerShell script from the URL above - Restart Computer (You will see the Physical Clearance prompt after the reboot) - Enable BitLocker Task\n","permalink":"https://devsecninja.com/2016/02/14/how-to-clear-a-tpm-2.0-chip-with-sccm-and-powershell/","summary":"\u003cp\u003eWith TPM 1.2, Microsoft was able to clear the TPM during the SCCM Task Sequence without asking for permission to clear the TPM.\u003c/p\u003e\n\u003cp\u003eWith TPM 2.0, SCCM is unable to clear and activate the TPM chip during the deployment.\u003c/p\u003e\n\u003cp\u003eThe first time you boot your computer, you need to provide a BitLocker Recovery Key, or the tpm.msc console will tell you that the TPM is ready for use, with reduced functionality. \u003ca href=\"https://github.com/jvravensberg/PowerShell/blob/master/Windows-General/BitLocker/Reset-TPMOwner.ps1\"\u003eI found a script online that I\u0026rsquo;ve added to my GitHub\u003c/a\u003e to clear the TPM 2.0 chip during the deployment.\u003c/p\u003e","title":"How to Clear a TPM 2.0 chip with SCCM and PowerShell"},{"content":"A very helpful course from Christopher Chapman and Yuri Diogenes. Must see if you want to learn more about Microsoft Advanced Threat Analytics (ATA) and best of all: it\u0026rsquo;s a free course from the Microsoft Virtual Academy! Enterprise Mobility Suite: Beyond \u0026ldquo;Bring Your Own Device\u0026rdquo;\n","permalink":"https://devsecninja.com/2016/02/07/microsoft-advanced-threat-analytics-ata/","summary":"\u003cp\u003eA very helpful course from Christopher Chapman and Yuri Diogenes. 
Must see if you want to learn more about Microsoft Advanced Threat Analytics (ATA) and best of all: it\u0026rsquo;s a free course from the Microsoft Virtual Academy! \u003ca href=\"https://www.youtube.com/watch?v=vmrg9Gt9ljw\"\u003eEnterprise Mobility Suite: Beyond \u0026ldquo;Bring Your Own Device\u0026rdquo;\u003c/a\u003e\u003c/p\u003e","title":"Microsoft Advanced Threat Analytics (ATA)"},{"content":"When you enable Device Guard or Credential Guard with Hyper-V on your system, your screen will blink every X seconds. This is a really annoying bug and has been fixed by Intel.\nSolution: Upgrade your Intel(R) HD Graphics driver to version 20.19.15.4352.\n","permalink":"https://devsecninja.com/2016/01/25/screen-display-flashes-or-blinks-if-device-guard-or-credential-guard-with-hyper-v-has-been-enabled/","summary":"\u003cp\u003eWhen you enable Device Guard or Credential Guard with Hyper-V on your system, your screen will blink every X seconds. This is a really annoying bug and has been fixed by Intel.\u003c/p\u003e\n\u003ch3 id=\"solution\"\u003eSolution:\u003c/h3\u003e\n\u003cp\u003eUpgrade your Intel(R) HD Graphics driver to version 20.19.15.4352.\u003c/p\u003e","title":"Screen display flashes or blinks if Device Guard or Credential Guard with Hyper-V has been enabled"},{"content":"Because I wanted to configure Device Guard with Windows 10, I need the Hyper-V Hypervisor to be enabled on Windows 10. I tried to do this with DISM and an answer file, but it\u0026rsquo;s not possible to enable Hyper-V during the Task Sequence Deployment because Hyper-V requires a couple of reboots.\nSolution Create a new \u0026ldquo;Set Task Sequence Variable\u0026rdquo; task in your Task Sequence. This will run the PowerShell command after the Task Sequence ends. 
I\u0026rsquo;ve set this task before enabling the Driver Package, but it should be possible to place this task anywhere you like.\nTask Sequence Variable: SMSTSPostAction Value: powershell -ExecutionPolicy ByPass -Command \u0026ldquo;Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-Hypervisor -all -NoRestart;Disable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-Tools-All,Microsoft-Hyper-V-Services -NoRestart\u0026rdquo; This will do the following:\nEnable all the Hyper-V Features after the deployment Remove the Hyper-V Tools and Services (Management Tools) afterwards. I found out that this is the best way to only add the Hyper-V Hypervisor. You still need to reboot the system a few times to enable this feature. Because I enabled the BitLocker PIN, I can\u0026rsquo;t reboot the machine because it will ask for a PIN a few times. Screenshot: ","permalink":"https://devsecninja.com/2016/01/25/enable-hyper-v-during-task-sequence-in-sccm-2012-r2/","summary":"\u003cp\u003eBecause I wanted to configure Device Guard with Windows 10, I need the Hyper-V Hypervisor to be enabled on Windows 10. I tried to do this with DISM and an answer file, but it\u0026rsquo;s not possible to enable Hyper-V during the Task Sequence Deployment because Hyper-V requires a couple of reboots.\u003c/p\u003e\n\u003ch2 id=\"solution\"\u003eSolution\u003c/h2\u003e\n\u003cp\u003eCreate a new \u0026ldquo;Set Task Sequence Variable\u0026rdquo; task in your Task Sequence. This will run the PowerShell command after the Task Sequence ends. I\u0026rsquo;ve set this task before enabling the Driver Package, but it should be possible to place this task anywhere you like.\u003c/p\u003e","title":"Enable Hyper-V during Task Sequence in SCCM 2012 R2"},{"content":"This video, presented by Mark Russinovich and Matt McSpirit, gives you a great overview about the design and architecture of containers. 
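The SMSTSPostAction value above is shown with curly typographic quotes, which would break the command if pasted literally. Reformatted with straight quotes (content unchanged from the post):

```powershell
# SMSTSPostAction value from the post, with straight quotes. Runs after the
# task sequence completes: enables the Hyper-V hypervisor, then removes the
# management tools and services again so only the hypervisor remains.
powershell -ExecutionPolicy ByPass -Command "Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-Hypervisor -all -NoRestart; Disable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-Tools-All,Microsoft-Hyper-V-Services -NoRestart"
```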
Mark will show you how to create and use the containers.\nDon\u0026rsquo;t forget to subscribe to the Microsoft Mechanics YouTube Channel.\n","permalink":"https://devsecninja.com/2016/01/24/the-basics-of-windows-server-containers/","summary":"\u003cp\u003eThis video, presented by Mark Russinovich and Matt McSpirit, gives you a great overview about the design and architecture of containers. Mark will show you how to create and use the containers.\u003c/p\u003e\n\u003cdiv style=\"position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;\"\u003e\n      \u003ciframe allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share; fullscreen\" loading=\"eager\" referrerpolicy=\"strict-origin-when-cross-origin\" src=\"https://www.youtube.com/embed/YoA_MMlGPRc?autoplay=0\u0026amp;controls=1\u0026amp;end=0\u0026amp;loop=0\u0026amp;mute=0\u0026amp;start=0\" style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;\" title=\"YouTube video\"\u003e\u003c/iframe\u003e\n    \u003c/div\u003e\n\n\u003cp\u003eDon\u0026rsquo;t forget to subscribe to the \u003ca href=\"https://www.youtube.com/channel/UCJ9905MRHxwLZ2jeNQGIWxA\"\u003eMicrosoft Mechanics YouTube Channel\u003c/a\u003e.\u003c/p\u003e","title":"The basics of Windows Server Containers"},{"content":"Recently I connected System Center - Virtual Machine Manager with WSUS. The WSUS server is installed on the primary site server of my SCCM 2012 R2 SP1 CU2 installation.\nAfter I configured my SCCM WSUS server as an update server for VMM, the distribution point in the office stopped working.\nYou will see HTTP ERROR \u0026ldquo;12030\u0026rdquo; in your logs and the PXE request on a client will fail.\nBrowsing to the website of the SCCM Primary Site server will fail too. 
I found out that the certificate of IIS on my primary site was gone.\nThere was no certificate selected for the Default Website.\nAfter adding the certificate again and restarting IIS, PXE started to work again.\n","permalink":"https://devsecninja.com/2016/01/21/sccm-pxe-stopped-working-after-configuring-update-server-in-vmm/","summary":"\u003cp\u003eRecently I connected System Center - Virtual Machine Manager with WSUS. The WSUS server is installed on the primary site server of my SCCM 2012 R2 SP1 CU2 installation.\u003c/p\u003e\n\u003cp\u003eAfter I configured my SCCM WSUS server as an update server for VMM, the distribution point in the office stopped working.\u003c/p\u003e\n\u003cp\u003eYou will see HTTP ERROR \u0026ldquo;12030\u0026rdquo; in your logs and the PXE request on a client will fail.\u003c/p\u003e\n\u003cp\u003eBrowsing to the website of the SCCM Primary Site server will fail too. I found out that the certificate of IIS on my primary site was gone.\u003c/p\u003e","title":"SCCM - PXE stopped working after configuring Update Server in VMM"},{"content":"Recently I found the following error in the SMSPXE.log log file on my newly created distribution point: CryptVerifySignature failed, 80090006 SMSPXE \u0026lt;REMOVED TIME\u0026gt; 2500 (0x09C4) untrusted certificate: \u0026lt;REMOVED CERTIFICATE\u0026gt; SMSPXE \u0026lt;REMOVED TIME\u0026gt; 2500 (0x09C4) Failed to get information for MP: https://SCCMPRIMARY.DOMAIN.TLD. 80090006. 
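The missing IIS certificate binding described above can be spotted from PowerShell before PXE breaks. A small check (my own sketch, assuming the WebAdministration module on the site server):

```powershell
# Hypothetical check: list the HTTPS/SSL bindings on the site server.
# A missing entry (or an empty Thumbprint) for port 443 matches the
# "no certificate selected for the Default Website" symptom described above.
Import-Module WebAdministration
Get-ChildItem IIS:\SslBindings | Select-Object IPAddress, Port, Thumbprint
```

After rebinding the certificate in IIS Manager and restarting IIS, the 443 entry should show the certificate's thumbprint again.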
SMSPXE \u0026lt;REMOVED TIME\u0026gt; 2500 (0x09C4) Recreating my certificate template for the IIS Service on the primary site server fixed the problem.\nCheck the online documentation of SCCM for the details of this certificate template.\n","permalink":"https://devsecninja.com/2016/01/21/sccm-smspxe.log-shows-untrusted-certificate/","summary":"\u003cp\u003eRecently I found the following error in the SMSPXE.log log file on my newly created distribution point: \u003ccode\u003eCryptVerifySignature failed, 80090006 SMSPXE \u0026lt;REMOVED TIME\u0026gt; 2500 (0x09C4) untrusted certificate: \u0026lt;REMOVED CERTIFICATE\u0026gt; SMSPXE \u0026lt;REMOVED TIME\u0026gt; 2500 (0x09C4) Failed to get information for MP: https://SCCMPRIMARY.DOMAIN.TLD. 80090006. SMSPXE \u0026lt;REMOVED TIME\u0026gt; 2500 (0x09C4)\u003c/code\u003e Recreating my certificate template for the IIS Service on the primary site server fixed the problem.\u003c/p\u003e\n\u003cp\u003eCheck the online documentation of SCCM for the details of this certificate template.\u003c/p\u003e","title":"SCCM - SMSPXE.log shows Untrusted certificate"},{"content":"I had a problem with Spotify on my notebook, connected with HDMI to my Pioneer receiver. I\u0026rsquo;m using Windows 10 with the 10586 Build. When I wanted to play music after I paused Spotify for a couple of minutes, the music didn\u0026rsquo;t play again. So I created a small PowerShell script that kills all the Spotify instances, but it\u0026rsquo;s a workaround, not a solution:\nGet-Process *spotify | Stop-Process Solution Right click the Speaker in your taskbar. Click on \u0026ldquo;Sounds\u0026rdquo;. Go to the \u0026ldquo;Playback\u0026rdquo; tab and click on your audio device. Push the \u0026ldquo;Properties\u0026rdquo; button and go to the \u0026ldquo;Advanced\u0026rdquo; tab. 
Under \u0026ldquo;Exclusive Mode\u0026rdquo;, untick \u0026ldquo;Allow applications to take exclusive control of this device\u0026rdquo;.\n","permalink":"https://devsecninja.com/2016/01/02/spotify-no-sound-after-pausing-in-windows-10/","summary":"\u003cp\u003eI had a problem with Spotify on my notebook, connected with HDMI to my Pioneer receiver. I\u0026rsquo;m using Windows 10 with the 10586 Build. When I wanted to play music after I paused Spotify for a couple of minutes, the music doesn\u0026rsquo;t play again. So I created a small PowerShell script that kills all the Spotify instances, but it\u0026rsquo;s a workaround, not a solution:\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" class=\"chroma\"\u003e\u003ccode class=\"language-powershell\" data-lang=\"powershell\"\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"nb\"\u003eGet-Process\u003c/span\u003e \u003cspan class=\"p\"\u003e*\u003c/span\u003e\u003cspan class=\"n\"\u003espotify\u003c/span\u003e \u003cspan class=\"p\"\u003e|\u003c/span\u003e \u003cspan class=\"nb\"\u003eStop-Process\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003ch2 id=\"solution\"\u003eSolution\u003c/h2\u003e\n\u003cp\u003eRight click the Speaker in your taskbar. Click on \u0026ldquo;Sounds\u0026rdquo;. Go to the \u0026ldquo;Playback\u0026rdquo; tab and click on your audio device. Push the \u0026ldquo;Properties\u0026rdquo; button and go to the \u0026ldquo;Advanced\u0026rdquo; tab. Under \u0026ldquo;Exclusive Mode\u0026rdquo;, untick \u0026ldquo;Allow applications to take exclusive control of this device\u0026rdquo;.\u003c/p\u003e","title":"Spotify - No sound after pausing in Windows 10"},{"content":"Today I created a user policy in an OU where Loopback Processing was applied and where Security Filtering was set to my account to test the policy. 
The policy didn\u0026rsquo;t show up in the RSOP data (gpresult /h report1.html) and the policy was not getting applied.\nSolution: Give the Domain Computers (or the group with the computer accounts from the OU) permission to read the GPO. Because of Loopback Processing, the computer account will be used to read the GPO, instead of the user account. You can still give the Domain Computers permission to read the GPO only, and add a user or group to the Security Filtering section to make sure that the GPO will be applied to that group or users. Cheers!\n","permalink":"https://devsecninja.com/2015/12/22/user-policy-not-applied-with-security-filtering-and-loopback-processing/","summary":"\u003cp\u003eToday I created a user policy in an OU where Loopback Processing was applied and where Security Filtering was set to my account to test the policy. The policy didn\u0026rsquo;t show up in the RSOP data (gpresult /h report1.html) and the policy was not getting applied.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSolution:\u003c/strong\u003e Give the Domain Computers (or the group with the computer accounts from the OU) permission to read the GPO. Because of Loopback Processing, the computer account will be used to read the GPO, instead of the user account. You can still give the Domain Computers permission to read the GPO only, and add a user or group to the Security Filtering section to make sure that the GPO will be applied to that group or users. Cheers!\u003c/p\u003e","title":"User Policy not applied with Security Filtering and Loopback Processing"},{"content":"Problem: Sometimes it\u0026rsquo;s possible that the registry keys SerializedMCSKey and SignedSerializedMCSKey in the HKEY_LOCAL_MACHINE\\SOFTWARE\\Microsoft\\SMS\\MCS location are empty after a fresh installation or after reinstalling multicast.\nSolution: Patience\u0026hellip; It took like 5 or 6 hours to get those values populated by SCCM / WDS. 
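The loopback-processing fix above (granting the computer accounts read access while keeping Security Filtering for users) can also be applied from PowerShell. A hedged sketch, assuming the GroupPolicy module and a placeholder GPO name:

```powershell
# Hypothetical sketch: give Domain Computers read (but not apply) access so
# loopback processing can resolve the user GPO via the computer account, while
# Security Filtering still controls which users it applies to.
# 'MyUserPolicy' is a placeholder GPO name, not from the original post.
Import-Module GroupPolicy
Set-GPPermission -Name 'MyUserPolicy' -TargetName 'Domain Computers' `
    -TargetType Group -PermissionLevel GpoRead
```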
I\u0026rsquo;ve searched for a way to force this, but I couldn\u0026rsquo;t find anything online. I\u0026rsquo;ve tried to reboot both machines, without any luck. If you know a way to force this, please let me know.\n","permalink":"https://devsecninja.com/2015/12/14/serializedmcskey-and-signedserializedmcskey-registry-keys-are-empty-sccm/","summary":"\u003cp\u003e\u003cstrong\u003eProblem:\u003c/strong\u003e Sometimes it\u0026rsquo;s possible that the registry keys SerializedMCSKey and SignedSerializedMCSKey in the HKEY_LOCAL_MACHINE\\SOFTWARE\\Microsoft\\SMS\\MCS location are empty after a fresh installation or after reinstalling multicast.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSolution:\u003c/strong\u003e Patience\u0026hellip; It took like 5 or 6 hours to get those values populated by SCCM / WDS. I\u0026rsquo;ve searched for a way to force this, but I couldn\u0026rsquo;t find anything online. I\u0026rsquo;ve tried to reboot both machines, without any luck. If you know a way to force this, please let me know.\u003c/p\u003e","title":"SerializedMCSKey and SignedSerializedMCSKey registry keys are empty - SCCM"},{"content":"Problem: In SCCM 2012 R2 SP1 CU2, I\u0026rsquo;ve created a package that deploys some files such as wallpapers with a .BAT file. When I check the execmgr.log, I see the following error:\nScript for Package:PR######, Program: Run Script failed with exit code 4.\nSolution: Under the program in SCCM, change \u0026ldquo;Run\u0026rdquo; from \u0026ldquo;Hidden\u0026rdquo; to \u0026ldquo;Normal\u0026rdquo;.\n","permalink":"https://devsecninja.com/2015/12/10/script-for-package-failed-with-exit-code-4-in-sccm-2012-r2/","summary":"\u003cp\u003e\u003cstrong\u003eProblem:\u003c/strong\u003e In SCCM 2012 R2 SP1 CU2, I\u0026rsquo;ve created a package that deploys some files such as wallpapers with a .BAT file. 
When I check the execmgr.log, I see the following error:\u003c/p\u003e\n\u003cblockquote\u003e\n\u003cp\u003eScript for Package:PR######, Program: Run Script failed with exit code 4.\u003c/p\u003e\n\u003c/blockquote\u003e\n\u003cp\u003e\u003cstrong\u003eSolution:\u003c/strong\u003e Under the program in SCCM, change \u0026ldquo;Run\u0026rdquo; from \u0026ldquo;Hidden\u0026rdquo; to \u0026ldquo;Normal\u0026rdquo;.\u003c/p\u003e","title":"Script for package failed with exit code 4 in SCCM 2012 R2"},{"content":"I had some issues with the newest Windows ADK (1511) with Configuration Manager 2012 R2 SP1 CU2. As of yesterday, it\u0026rsquo;s possible to download the newest version of Configuration Manager: 1511. Because I had issues with the newest ADK, I\u0026rsquo;ve asked Microsoft on Technet if we still need to use the older ADK (10.0.26624.0) or if we can use the newest 1511 version of the ADK: [\n](/images/2015/12/adkversion.png) Check the post mentioned in the image above to download the right ADK version when you are going to use the new Configuration Manager 1511.\n","permalink":"https://devsecninja.com/2015/12/09/which-windows-adk-version-to-use-with-configuration-manager-1511/","summary":"\u003cp\u003eI had some issues with the newest Windows ADK (1511) with Configuration Manager 2012 R2 SP1 CU2. As of yesterday, it\u0026rsquo;s possible to download the newest version of Configuration Manager: 1511. 
Because I had issues with the newest ADK, I\u0026rsquo;ve asked Microsoft on Technet if we still need to use the older ADK (10.0.26624.0) or if we can use the newest 1511 version of the ADK: [\u003c/p\u003e\n\u003cp\u003e\u003cimg alt=\"ADKVersion\" loading=\"lazy\" src=\"/images/2015/12/adkversion.png\"\u003e](/images/2015/12/adkversion.png) \u003ca href=\"http://blogs.technet.com/b/configmgrteam/archive/2015/11/20/issue-with-the-windows-adk-for-windows-10-version-1511.aspx\"\u003eCheck the post mentioned in the image above\u003c/a\u003e to download the right ADK version when you are going to use the new Configuration Manager 1511.\u003c/p\u003e","title":"Which Windows ADK version to use with Configuration Manager 1511?"},{"content":"I was capturing a new Windows 10 TH2 (1511) image with SCCM 2012 R2 SP1 CU2 when suddenly the capturing process stops and ends with a Blue Screen of Death: \u0026ldquo;SYSTEM_THREAD_EXCEPTION_NOT_HANDLED\u0026rdquo;.\nCurrent environment: SCCM 2012 R2 SP1 CU2 Primary Site Local Distribution Point Windows Server 2012 R2 OS Based on Hyper-V 2008 R2 and 2012 R2.\nWindows 10 Template on Hyper-V Server 2008 R2 Cluster with VM Version 1.\nFirstly I thought that the boot image was corrupt or not working, so I tried to recreate the image using the following post.\nUnfortunately, the BSOD comes up with both boot images.\nSolution: Use a Generation 2 VM instead of a Generation 1 VM.\n","permalink":"https://devsecninja.com/2015/12/08/bsod-when-capturing-image-with-sccm-2012-r2-sp1/","summary":"\u003cp\u003eI was capturing a new Windows 10 TH2 (1511) image with SCCM 2012 R2 SP1 CU2 when suddenly the capturing process stops and ends with a Blue Screen of Death: \u0026ldquo;SYSTEM_THREAD_EXCEPTION_NOT_HANDLED\u0026rdquo;.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eCurrent environment:\u003c/strong\u003e SCCM 2012 R2 SP1 CU2 Primary Site Local Distribution Point Windows Server 2012 R2 OS Based on Hyper-V 2008 R2 and 2012 
R2.\u003c/p\u003e\n\u003cp\u003eWindows 10 Template on Hyper-V Server 2008 R2 Cluster with VM Version 1.\u003c/p\u003e\n\u003cp\u003eFirstly I thought that the boot image was corrupt or not working, so I tried to recreate the image using \u003ca href=\"http://blogs.technet.com/b/brandonlinton/archive/2015/07/30/windows-10-adk-boot-image-updates-for-configuration-manager.aspx\"\u003ethe following post\u003c/a\u003e.\u003c/p\u003e","title":"BSOD when capturing image with SCCM 2012 R2 SP1"},{"content":"Problem: When you upload your new Windows 10 ADK boot image to SCCM 2012 R2 SP1, the process ends with the error \u0026ldquo;The specified file is invalid. Only maximum compression type is supported\u0026rdquo;. Solution: Download the newest Windows ADK for Windows 10, Version 1511 here at Microsoft.\n","permalink":"https://devsecninja.com/2015/12/08/the-specified-file-is-invalid.-only-maximum-compression-type-is-supported-sccm-2012-r2-sp1/","summary":"\u003cp\u003e\u003cstrong\u003eProblem:\u003c/strong\u003e When you upload your new Windows 10 ADK boot image to SCCM 2012 R2 SP1, the process ends with the error \u0026ldquo;The specified file is invalid. Only maximum compression type is supported\u0026rdquo;. \u003cstrong\u003eSolution:\u003c/strong\u003e Download the newest Windows ADK for Windows 10, Version 1511 \u003ca href=\"https://msdn.microsoft.com/en-us/windows/hardware/dn913721(v=vs.8.5).aspx\"\u003ehere at Microsoft\u003c/a\u003e.\u003c/p\u003e","title":"\"The specified file is invalid. Only maximum compression type is supported\" - SCCM 2012 R2 SP1"},{"content":"Recently I had an issue with my work notebook, a Dell Latitude E6540. After upgrading to the new Windows 10 TH2 version, my Bluetooth keyboard and headset couldn\u0026rsquo;t connect after a shutdown. I had to reconnect the Bluetooth devices again to temporarily fix it. 
I was looking for a recent driver for my Intel Centrino Advanced-N 6235 network card, but it was already up-to-date because of a manual Windows Updates action I did after the upgrade to TH2. I realized that my \u0026ldquo;Intel(R) Wireless Bluetooth(R)\u0026rdquo; wasn\u0026rsquo;t up-to-date.\nThe driver date was from 2013. I don\u0026rsquo;t know why Microsoft or Intel didn\u0026rsquo;t update that driver during the installation of the Windows Updates.\nThis was an easy fix:\nOpen Device Manager. Find the \u0026ldquo;Intel(R) Wireless Bluetooth(R)\u0026rdquo; under the Bluetooth section. Right click on the \u0026ldquo;Intel(R) Wireless Bluetooth(R)\u0026rdquo; item and click on \u0026ldquo;Update driver software\u0026rdquo;. Voila, Microsoft installed the 2015 driver for me which fixed the problem. Cheers!\n","permalink":"https://devsecninja.com/2015/12/02/windows-10-intel-bluetooth-cant-connect-to-devices-after-reboot-in-th2/","summary":"\u003cp\u003eRecently I had an issue with my work notebook, a Dell Latitude E6540. After upgrading to the new Windows 10 TH2 version, my Bluetooth keyboard and headset couldn\u0026rsquo;t connect after a shutdown. I had to reconnect the Bluetooth devices again to temporarily fix it. I was looking for a recent driver for my Intel Centrino Advanced-N 6235 network card, but it was already up-to-date because of a manual Windows Updates action I did after the upgrade to TH2. I realized that my \u0026ldquo;Intel(R) Wireless Bluetooth(R)\u0026rdquo; wasn\u0026rsquo;t up-to-date.\u003c/p\u003e","title":"Windows 10 - Intel Bluetooth can't connect to devices after reboot in TH2"},{"content":"During an Operating System Deployment in SCCM 2012 R2, new notebook models crashed after installing the driver package. 
I couldn\u0026rsquo;t see the BSOD code because I wasn\u0026rsquo;t able to boot the computer in safe mode, but after taking some pictures with my phone of the BSOD screen, I found out that the error code was 0x0000007e.\nThe next step: how are you going to troubleshoot 160 drivers that you deployed to those models with your new driver package?\nYou could delete the original driver package and create a new one and insert the drivers one by one.\nThat could be very time-consuming.\nInstalling multiple drivers with dpinst.exe Our driver package was nicely ordered by the manufacturer. So I had like 15 folders per driver set. For example: - Audio - Bluetooth - LAN - WLAN I want to troubleshoot all the drivers in those folders because it\u0026rsquo;s easier to tell the manufacturer that they need to fix their \u0026ldquo;Audio\u0026rdquo; drivers rather than telling them that the task sequence crashes on the driver package because of a faulty driver.\nDownload and install the Windows Driver Kit. The Windows 8.1 kit is for this purpose compatible with Windows 7. Don\u0026rsquo;t forget to check the installation directory of the kit during the installation. We need to copy dpinst.exe from the \\redist\\DIFx subdirectory. Copy it to a share that is easy to map with the \u0026ldquo;net use\u0026rdquo; command later. For example: \\\\Server\\Share\\dpinst.exe Create a DPInst.xml file with UTF-8 encoding. UTF-8 encoding is an option in the Save As screen in Notepad. Add the following code to the DPInst.xml file to troubleshoot the Audio driver folder: \u0026lt;?xml version=\u0026#34;1.0\u0026#34; ?\u0026gt; \u0026lt;dpinst\u0026gt; \u0026lt;!-- The following search and subDirectory elements direct DPInst to search all subdirectories (under the DPInst working directory) to locate driver packages. 
--\u0026gt; \u0026lt;search\u0026gt; \u0026lt;subDirectory\u0026gt;\u0026lt;strong\u0026gt;Audio\u0026lt;/strong\u0026gt;\u0026lt;/subDirectory\u0026gt; \u0026lt;/search\u0026gt; \u0026lt;/dpinst\u0026gt; Copy the contents of your driver package to your share, for example \\Server\\Share. Make sure you have a structure like \\Server\\Share\\Audio, to troubleshoot the audio drivers. Change the task sequence to \u0026ldquo;Auto Apply Drivers\u0026rdquo; so that your device will miss some drivers, but your deployment should finish. You have a test device after the deployment that can be used for further troubleshooting. On your test device, open a PowerShell prompt and run the following code, or create a small script: net use L: \\\\Server\\Share L:\\dpinst.exe /F After running the code above, you\u0026rsquo;ll see a screen where the drivers are being installed. After installing the drivers, reboot your test device. Do this again for the next driver folder: Change the subdirectory in the DPInst.xml file from Audio to, for example, Bluetooth to test the Bluetooth drivers. Repeat the PowerShell code, install the drivers and reboot the test device again to find out which driver folder is causing the Blue Screen of Death. Cheers!\n","permalink":"https://devsecninja.com/2015/12/01/troubleshoot-bsod-after-deploying-driver-package-in-sccm-2012-r2/","summary":"\u003cp\u003eDuring an Operating System Deployment in SCCM 2012 R2, new notebook models crashed after installing the driver package. 
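The per-folder loop described above (edit DPInst.xml, rerun dpinst.exe, reboot) can be tied together in one small script. A hedged sketch, assuming the same placeholder paths as the post:

```powershell
# Hypothetical helper combining the steps above: regenerate DPInst.xml for one
# driver folder, then run dpinst.exe /F against the mapped share. 'L:' is the
# drive mapped earlier with: net use L: \\Server\Share
$folder = 'Audio'   # next rounds: 'Bluetooth', 'LAN', 'WLAN', ...
@"
<?xml version="1.0" ?>
<dpinst>
  <search>
    <subDirectory>$folder</subDirectory>
  </search>
</dpinst>
"@ | Set-Content -Path 'L:\DPInst.xml' -Encoding UTF8
& 'L:\dpinst.exe' /F   # install every driver found under that subdirectory
```

Reboot between rounds, as the post says, to isolate which folder triggers the BSOD.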
I couldn\u0026rsquo;t see the BSOD code because I wasn\u0026rsquo;t able to boot the computer in safe mode, but after taking some pictures with my phone of the BSOD screen, I found out that the error code was \u003cstrong\u003e0x0000007e\u003c/strong\u003e.\u003c/p\u003e\n\u003cp\u003eThe next step: how are you going to troubleshoot 160 drivers that you deployed to those models with your new driver package?\u003c/p\u003e","title":"Troubleshoot BSOD after deploying driver package in SCCM 2012 R2"},{"content":"Today I had an issue with one of our new notebook models at a client.\nProblem: After you PXE-boot the device, the device doesn\u0026rsquo;t get its advertisement from SCCM and reboots back into the original OS.\nIf you open the command prompt with F8 during the boot image startup process, no IP address is shown after you type in the command \u0026ldquo;ipconfig\u0026rdquo;.\nThe driver package from the manufacturer we used included a Windows 7 NIC driver, which was good for the install image.\nIn our System Center - Configuration Manager 2012 R2 (SCCM / ConfigMgr) environment, we used a WinPE version of 6.3.9600.16384 which is equal to Windows 8.1.\nThe \u0026ldquo;Intel(R) Ethernet Connection I219-V\u0026rdquo; Windows 7 driver (NDIS62) doesn\u0026rsquo;t work with the Windows 8.1 Boot Image (WinPE) we used.\nThis worked fine for older models like the \u0026ldquo;Intel(R) Ethernet Connection I217-LM\u0026rdquo; and the \u0026ldquo;Intel(R) Ethernet Connection I218-LM\u0026rdquo;.\nSolution: Add the NDIS64 as well as the NDIS62 version of the driver to your SCCM boot image (WinPE). After updating the boot image, your deployment will work again.\nUpdate: Check the comments below as well for some tips! 
Cheers!\n","permalink":"https://devsecninja.com/2015/11/25/intel-i219-v-ethernet-connection-driver-doesnt-work-in-winpe-sccm/","summary":"\u003cp\u003eToday I had an issue with one of our new notebook models at a client.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eProblem:\u003c/strong\u003e After you PXE-boot the device, the device doesn\u0026rsquo;t get his advertisement from SCCM and goes in a reboot to the original OS.\u003c/p\u003e\n\u003cp\u003eIf you open the command prompt with F8 during the boot image startup process, no IP address is shown after you type in the command \u0026ldquo;ipconfig\u0026rdquo;.\u003c/p\u003e\n\u003cp\u003eThe driver package from the manufacturer we used included a Windows 7 NIC driver, which was good for the install image.\u003c/p\u003e","title":"Intel I219-V Ethernet Connection driver doesn't work in WinPE (SCCM)"},{"content":"In my previous post PowerShell Profiles - The profile.ps1 file I showed you my profile.ps1 file.\nIn this post, I\u0026rsquo;ll show you a way to structure your base file, so that you can use it for your functions and aliases.\nMake sure that you always use max 2 files.\nThe first file is your profile.ps1 file and the other file is this _PSH_BASE.ps1 file.\nIf you use like 3 or 4 files, it can take a couple of seconds to load your PowerShell session.\nStart with the header:\n######################################################################################### # File Name is: _PSH_BASE.ps1 ######################################################################################### # Powershell Profile Base - Updated:30/08/2015 @ 14:00 ######################################################################################### It contains some basic stuff, but it\u0026rsquo;s not necessary.\n# Set the Home Directory for Powershell to start in Set-Location $env:ScriptHome This part will start your PowerShell session in your script home. 
More information about this environment variable in my previous blog post.\n## Profile Functions ################################### #region Functions # PLACE FUNCTIONS HERE #endregion Because I have a lot of functions in my PowerShell Profile, it\u0026rsquo;s great to use a region so that you can hide your code with the + sign in PowerShell ISE like this: ## Profile Aliases ################################### Set-Alias -Name list -Value dir/w | Out-Null This is the place where you can define your profile aliases. For example, if you type \u0026ldquo;list\u0026rdquo; in the console, it will do the \u0026ldquo;dir/w\u0026rdquo; command in the background.\n## Only PowerShell ISE ################################### if ($host.name -eq \u0026#39;Windows PowerShell ISE Host\u0026#39;) { Enable-ScriptBrowser } I\u0026rsquo;ve downloaded the ScriptBrowser plugin for ISE and I want it to run only in the PowerShell ISE console. Here you can place functions or commands that should only run in PowerShell ISE. 
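One thing worth knowing about the `list` alias above: a PowerShell alias can only map a name to a command name, not to a command plus arguments, so an alias pointing at `dir/w` will fail when invoked. The usual workaround is to wrap the arguments in a small function and alias that. A sketch (the function name is my own):

```powershell
# Aliases map one name to one command name only; to bundle arguments,
# wrap them in a function and alias the function instead.
function Get-WideListing { Get-ChildItem @args | Format-Wide -AutoSize }
Set-Alias -Name list -Value Get-WideListing
```

Typing `list` now produces the wide directory listing the alias was meant to give.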
Use \u0026ldquo;ConsoleHost\u0026rdquo; instead of \u0026ldquo;Windows PowerShell ISE Host\u0026rdquo; to define commands and functions to run only in the console.\n## Only PowerShell Console ################################### if ($host.name -eq \u0026#39;ConsoleHost\u0026#39;) { $a = (Get-Host).PrivateData $a.ErrorForegroundColor = \u0026#34;RED\u0026#34; $a.ErrorBackgroundColor = \u0026#34;WHITE\u0026#34; $a.WarningForegroundColor = \u0026#34;YELLOW\u0026#34; $a.WarningBackgroundColor = \u0026#34;BLACK\u0026#34; $a.DebugForegroundColor = \u0026#34;YELLOW\u0026#34; $a.DebugBackgroundColor = \u0026#34;BLACK\u0026#34; $a.VerboseForegroundColor = \u0026#34;YELLOW\u0026#34; $a.VerboseBackgroundColor = \u0026#34;BLACK\u0026#34; $a.ProgressForegroundColor = \u0026#34;YELLOW\u0026#34; $a.ProgressBackgroundColor = \u0026#34;DARKBLUE\u0026#34; $Shell = $Host.UI.RawUI $size = $Shell.BufferSize $size.width=150 $size.height=500 $Shell.BufferSize = $size $size = $Shell.WindowSize $size.width=150 $size.height=50 $Shell.WindowSize = $size } I use the above code to change the error colors in the PowerShell Console. This doesn\u0026rsquo;t work in PowerShell ISE, which is why I used the if ($host.name) check.\n# Let the user know you\u0026#39;re done $ConsoleType = $Host.Name Write-Host -ForegroundColor \u0026#34;Green\u0026#34; \u0026#34;Base profile for $ConsoleType loaded.\u0026#34; Last but not least, let the user know that the Base profile is loaded for the ISE or Console. In the next blog post, I\u0026rsquo;ll show you some great functions I use with my PowerShell Profile. 
Cheers!\n","permalink":"https://devsecninja.com/2015/08/30/powershell-profiles-the-structure-of-your-_psh_base.ps1-file/","summary":"\u003cp\u003eIn my previous post \u003ca href=\"https://devsecninja.com/2015/08/30/powershell-profiles-the-profile-ps1-file/\"\u003ePowerShell Profiles - The profile.ps1 file\u003c/a\u003e I showed you my profile.ps1 file.\u003c/p\u003e\n\u003cp\u003eIn this post, I\u0026rsquo;ll show you a way to structure your base file, so that you can use it for your functions and aliases.\u003c/p\u003e\n\u003cp\u003eMake sure that you always use max 2 files.\u003c/p\u003e\n\u003cp\u003eThe first file is your profile.ps1 file and the other file is this _PSH_BASE.ps1 file.\u003c/p\u003e\n\u003cp\u003eIf you use like 3 or 4 files, it can take a couple of seconds to load your PowerShell session.\u003c/p\u003e","title":"PowerShell Profiles - The structure of your _PSH_BASE.ps1 file"},{"content":"Welcome to this blog series about PowerShell profiles. I\u0026rsquo;m using PowerShell profiles for a couple of months now to make life a lot easier.\nTo start this blog series, I would like to show you my Profile.ps1 file.\nIt\u0026rsquo;s located in \u0026ldquo;C:\\Users\\\\Documents\\WindowsPowerShell\u0026rdquo;.\nBecause I use my PowerShell profiles at multiple locations such as my work notebook, home computer and sometimes at projects, I need to make sure that my PowerShell script home is always right so that the rest of the PowerShell profile is able to load successfully.\nThat\u0026rsquo;s why I have these commands at the beginning of my profile:\n# Set the Home Directory for Powershell to start in $ScriptHomeOneDrive = \u0026#34;$Home\\OneDrive\\Scripts\u0026#34; $ScriptHomeDocuments = \u0026#34;$Home\\Documents\\Scripts\u0026#34; $ScriptHomeMyDocuments = \u0026#34;$Home\\My Documents\\Scripts\u0026#34; if ((Test-Path -Path $ScriptHomeOneDrive )) { $ENV:ScriptHome = $ScriptHomeOneDrive } Elseif ((Test-Path -Path $ScriptHomeDocuments )) { 
$ENV:ScriptHome = $ScriptHomeDocuments } Else { $ENV:ScriptHome = $ScriptHomeMyDocuments } Set-Location $ENV:ScriptHome Because I\u0026rsquo;m using OneDrive as my script repository, I need to make sure that the script start in my OneDrive\\Scripts directory. When I have access to a computer or server without OneDrive, I need to use the Documents folder as my script home. On some servers, the \u0026ldquo;Documents\u0026rdquo; folder is named as \u0026ldquo;My Documents\u0026rdquo;, so that\u0026rsquo;s why I use that folder too.\n# Load the base functions . .\\Startup\\_PSH_BASE.ps1 The above code will load my base file which is located in my OneDrive folder or in my Documents folder.\nNow I can use my OneDrive folder to sync all my scripts between multiple computers and make changes to the profile from every device.\nMake sure you sign those PowerShell Profile Scripts, so that you know that it\u0026rsquo;s not changed! In my next post, I\u0026rsquo;ll show the structure of my _PSH_BASE.ps1 file and other files that are part of my PowerShell profile.\n","permalink":"https://devsecninja.com/2015/08/30/powershell-profiles-the-profile.ps1-file/","summary":"\u003cp\u003eWelcome to this blog series about PowerShell profiles. 
I\u0026rsquo;m using PowerShell profiles for a couple of months now to make life a lot easier.\u003c/p\u003e\n\u003cp\u003eTo start this blog series, I would like to show you my Profile.ps1 file.\u003c/p\u003e\n\u003cp\u003eIt\u0026rsquo;s located in \u0026ldquo;C:\\Users\\\u003c!-- raw HTML omitted --\u003e\\Documents\\WindowsPowerShell\u0026rdquo;.\u003c/p\u003e\n\u003cp\u003eBecause I use my PowerShell profiles at multiple locations such as my work notebook, home computer and sometimes at projects, I need to make sure that my PowerShell script home is always right so that the rest of the PowerShell profile is able to load successfully.\u003c/p\u003e","title":"PowerShell Profiles - The profile.ps1 file"},{"content":"**During the Microsoft Ignite event, Microsoft showed a new Windows 10 feature called Continuum.\nContinuum makes it possible to turn your phone or tablet into a desktop.** Joe Belfiore (Corporate Vice President, Operating Systems Group @ Microsoft) proudly demonstrated this latest development.\nYour phone connects, ideally via Bluetooth, to a keyboard and mouse.\nBy connecting the phone to a monitor, you automatically switch from phone mode to a limited desktop environment.\nAs far as could be seen, only Modern Apps can be used in this environment.\nAn interesting development if you mainly use the Office Suite, such as Word, Excel, PowerPoint and Outlook, since these are available as Modern Apps (albeit in a limited form).\nThis strengthens the case for companies to keep developing Modern Apps, because they can soon be used on different Windows devices.\nThat way you develop a single app for phone, tablet, laptop, desktop and even the Xbox.\nIf documents are then available everywhere through OneDrive for Business, this can be an ideal workhorse on the road.\nAs soon as you connect a keyboard or mouse to a tablet instead of a phone, you experience Continuum in a different way.\nTablet mode means, for example, that your Start screen becomes full screen and that all apps are shown full screen.\nYou are not limited to Metro apps here, whereas with Continuum for Phone you are.\nFuture outlook Although you don\u0026rsquo;t get the full desktop and Office experience, this can be an important development for people who spend a lot of time on the road.\nMany people only use the basic functionality of the Office Suite, which can mean a substantial saving on Office licenses.\nUsers do need to bring a USB hub to connect a keyboard, mouse and monitor, or bring their own Bluetooth peripherals.\nIf this becomes a success, I expect Bluetooth keyboards and mice to come onto the market that are easier to pair at flexible workplaces, so that users don\u0026rsquo;t have to set up the pairing again every time.\nIf a phone isn\u0026rsquo;t powerful enough, or perhaps too small, you can always look at an 8\u0026quot; tablet.\nJoe Belfiore showed it working on a Lenovo ThinkPad 8.\nThese models are compact and can often be connected to a docking station.\nOn the road you use tablet mode, and at the office you get the full desktop experience.\nIdeal for people who travel a lot between different offices.\n","permalink":"https://devsecninja.com/2015/08/28/windows-continuum/","summary":"\u003cp\u003e**During the Microsoft Ignite event, Microsoft showed a new Windows 10 feature called Continuum.\u003c/p\u003e\n\u003cp\u003eContinuum makes it possible to turn your phone or tablet into a desktop.** Joe Belfiore (Corporate Vice President, Operating Systems Group @ Microsoft) proudly demonstrated this latest development.\u003c/p\u003e\n\u003cp\u003eYour phone connects, ideally via Bluetooth, to a keyboard and mouse.\u003c/p\u003e\n\u003cp\u003eBy connecting the phone to a monitor, you automatically switch from phone mode to a limited desktop environment.\u003c/p\u003e","title":"Windows Continuum"},{"content":"After installing Windows 10 on my production device (with a Windows 8.1 Hyper-V VM as back-up, of course), I had an issue with the HDMI Audio of my Dell Latitude E6540 notebook. I found some topics online about setting the default format of the Audio device, such as sample rate and bit depth, but that didn\u0026rsquo;t work for me. 
I had to reconfigure my Receiver + 5.1 Audio as \u0026ldquo;5.1 Surround\u0026rdquo;, instead of the default \u0026ldquo;Stereo\u0026rdquo;.\u003c/p\u003e","title":"Windows 10 - Intel HDMI Audio doesn't work"},{"content":"I had an issue within my lab with deploying Windows 8.1 drivers to Windows 10 with SCCM 2012 R2 SP1. It isn\u0026rsquo;t possible to make all Windows 8.1 drivers compatible with Windows 10 within the SCCM 2012 R2 SP1 console with just one click. Because I was running within a lab environment and I only had 2 driver packages for Windows 8.1 x64, I was able to make the drivers available for deployment to all platforms. You can do this with the magic of PowerShell:\nSolution Get-CMDriver | Set-CMDriver -RunOnAnyPlatform Note: make sure that all your drivers are compatible for all platforms. Make sure you have separate x64 and x86 Task Sequences and Driver Packages/Categories with drivers. Because I have separate Task Sequences/Driver Packages and categories for x64 and x86, it isn\u0026rsquo;t a problem for me to have the drivers supported on all platforms.\n","permalink":"https://devsecninja.com/2015/07/23/sccm-2012-r2-sp1-make-windows-8.1-drivers-supported-on-windows-10-with-powershell/","summary":"\u003cp\u003eI had an issue within my lab with deploying Windows 8.1 drivers to Windows 10 with SCCM 2012 R2 SP1. It isn\u0026rsquo;t possible to make all Windows 8.1 drivers compatible with Windows 10 within the SCCM 2012 R2 SP1 console with just one click. Because I was running within a lab environment and I only had 2 driver packages for Windows 8.1 x64, I was able to make the drivers available for deployment to all platforms. 
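The one-liner above flips every imported driver. If you want to limit the change to a single driver category instead, a hedged sketch (the category name and the `LocalizedCategoryInstanceNames` property check are assumptions about your site's setup; run it from a ConfigMgr console PowerShell session):

```powershell
# Sketch: scope -RunOnAnyPlatform to one driver category instead of the
# whole site. 'Windows 8.1 x64' is a placeholder for your own category.
$category = 'Windows 8.1 x64'
Get-CMDriver |
    Where-Object { $_.LocalizedCategoryInstanceNames -contains $category } |
    Set-CMDriver -RunOnAnyPlatform
```

Scoping like this keeps drivers you have not validated on Windows 10 out of the change.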
You can do this with the magic of PowerShell:\u003c/p\u003e","title":"SCCM 2012 R2 SP1 - Make Windows 8.1 drivers supported on Windows 10 with PowerShell"},{"content":"I was deploying Windows 10 with SCCM 2012 R2 SP1 and the task sequence failed after \u0026ldquo;Installing device drivers\u0026rdquo; with error code 0x80070032 (or 80070032). The \u0026ldquo;Auto Apply Drivers\u0026rdquo; task works fine, but doesn\u0026rsquo;t install a lot of drivers. The smsts.log file:\nDism failed with return code 50 Failed to add driver to driver store. Code 0x80070032 Failed to provision driver. Code 0x80070032 Exiting with return code 0x80070032\nSolution:\nMake sure you\u0026rsquo;re using the latest MDT version, compatible with Windows 10. Make sure you\u0026rsquo;re using at least ADK 10. Check if your boot image has OS Version 10 or higher. If not, create a new boot image with MDT or ADK. ","permalink":"https://devsecninja.com/2015/07/23/windows-10-sccm-2012-r2-sp1-fails-with-error-0x80070032/","summary":"\u003cp\u003eI was deploying Windows 10 with SCCM 2012 R2 SP1 and the task sequence failed after \u0026ldquo;Installing device drivers\u0026rdquo; with error code 0x80070032 (or 80070032). The \u0026ldquo;Auto Apply Drivers\u0026rdquo; task works fine, but doesn\u0026rsquo;t install a lot of drivers. \u003cstrong\u003eThe smsts.log file:\u003c/strong\u003e\u003c/p\u003e\n\u003cblockquote\u003e\n\u003cp\u003eDism failed with return code 50 Failed to add driver to driver store. Code 0x80070032 Failed to provision driver. Code 0x80070032 Exiting with return code 0x80070032\u003c/p\u003e\n\u003c/blockquote\u003e\n\u003cp\u003e\u003cstrong\u003eSolution:\u003c/strong\u003e\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eMake sure you\u0026rsquo;re using the latest MDT version, compatible with Windows 10.\u003c/li\u003e\n\u003cli\u003eMake sure you\u0026rsquo;re using at least ADK 10.\u003c/li\u003e\n\u003cli\u003eCheck if your boot image has OS Version 10 or higher. 
If not, create a new boot image with MDT or ADK.\u003c/li\u003e\n\u003c/ul\u003e","title":"Windows 10 - SCCM 2012 R2 SP1 fails with error 0x80070032"},{"content":"I was doing a migration project from System Center - Operations Manager 2007 to System Center - Operations Manager 2012 R2.\nSome computers had trouble with the upgrade of the agent. I first tried a client push deployment. 80% of the installations succeeded, but a couple of computers had failed installations/upgrades. 1.\nStart the agent installation manually on the failed computer.\nRun the MOMAgent.msi installer. 2.\nClick \u0026ldquo;Next\u0026rdquo;.\nAccept the terms and click \u0026ldquo;I agree\u0026rdquo;. **3.\nNotice the \u0026ldquo;Upgrade\u0026rdquo; button instead of the \u0026ldquo;Install\u0026rdquo; button.** 4.\nClose the setup and check the solution below.\nSolution: I was able to fix this by removing the product registry key of the SCOM 2007 Agent. You can find it at HKEY_CLASSES_ROOT\\Installer\\Products\\. Search for a key with ProductName \u0026ldquo;System Center Operations Manager 2007 R2 Agent\u0026rdquo;.\nBack up your registry and remove the key. Try the setup again and notice the \u0026ldquo;Next\u0026rdquo; button instead of the \u0026ldquo;Upgrade\u0026rdquo; button. The agent push will work now.\n","permalink":"https://devsecninja.com/2015/05/01/manually-remove-scom-2012-or-scom-2007-r2-agent/","summary":"\u003cp\u003eI was doing a migration project from System Center - Operations Manager 2007 to System Center - Operations Manager 2012 R2.\u003c/p\u003e\n\u003cp\u003eSome computers had trouble with the upgrade of the agent. I first tried a client push deployment. 80% of the installations succeeded, but a couple of computers had failed installations/upgrades. 1.\u003c/p\u003e\n\u003cp\u003eStart the agent installation manually on the failed computer.\u003c/p\u003e\n\u003cp\u003eRun the MOMAgent.msi installer. 
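The search-and-remove steps from the SCOM agent solution above can be scripted so you don't have to browse HKEY_CLASSES_ROOT by hand. A minimal sketch, run from an elevated PowerShell prompt; the backup file path is my own choice:

```powershell
# Sketch of the manual steps above: locate the SCOM 2007 R2 Agent product key
# under HKEY_CLASSES_ROOT\Installer\Products, export it as a backup, remove it.
New-PSDrive -Name HKCR -PSProvider Registry -Root HKEY_CLASSES_ROOT | Out-Null

$productName = 'System Center Operations Manager 2007 R2 Agent'
$key = Get-ChildItem 'HKCR:\Installer\Products' |
    Where-Object { ($_ | Get-ItemProperty).ProductName -eq $productName }

if ($key) {
    reg.exe export $key.Name "$env:TEMP\scom-agent-key.reg" /y   # back up first
    Remove-Item -Path $key.PSPath -Recurse
}
```

After the key is gone, rerunning MOMAgent.msi should offer \u0026ldquo;Next\u0026rdquo; instead of \u0026ldquo;Upgrade\u0026rdquo;, matching the manual fix.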
2.\u003c/p\u003e","title":"Manually remove SCOM 2012 or SCOM 2007 R2 Agent"},{"content":"I had a small issue with my Remote Desktop Services Lab environment. I wanted to add a certificate signed by my PKI infrastructure to the Remote Desktop Roles. I created a certificate template like in this post.\nWhen I was importing the certificates into the wizard, the certificate looked fine because the state after selecting the certificate said \u0026ldquo;Success\u0026rdquo;.\nWhen you reopen the screen afterwards, it was as if no certificate had been selected. I was able to see the certificate in my browser, so the selection was successful. I wasn\u0026rsquo;t able to find out what the problem was on the internet, but I fixed it by changing the certificate template.\nIn the \u0026ldquo;Subject Name\u0026rdquo; tab, make sure that the Subject name format is \u0026ldquo;DNS name\u0026rdquo; and that under \u0026ldquo;Include this information in alternate subject name\u0026rdquo;, \u0026ldquo;DNS name\u0026rdquo; is selected too.\nThat fixed it for me. [\n](/images/2015/02/certificatesuccess.png)\n","permalink":"https://devsecninja.com/2015/02/19/remote-desktop-services-certificate-state-success-but-level-is-not-configured/","summary":"\u003cp\u003eI had a small issue with my Remote Desktop Services Lab environment. I wanted to add a certificate signed by my PKI infrastructure to the Remote Desktop Roles. 
I created a certificate template like in \u003ca href=\"http://blogs.technet.com/b/askperf/archive/2014/01/24/certificate-requirements-for-windows-2008-r2-and-windows-2012-remote-desktop-services.aspx\" title=\"Blog post\"\u003ethis post\u003c/a\u003e.\u003c/p\u003e\n\u003cp\u003eWhen I was importing the certificates into the wizard, the certificate looks fine because the state after selecting the certificate says \u0026ldquo;Success\u0026rdquo;.\u003c/p\u003e\n\u003cp\u003e\u003cimg alt=\"CertificateError\" loading=\"lazy\" src=\"/images/2015/02/certificateerror.png\"\u003e When you reopen the screen afterwards, it was like no certificate has been selected. I was able to see the certificate in my browser, so the selection was successful. I wasn\u0026rsquo;t able to find out what was the problem on the internet, but I fixed it to change the certificate template.\u003c/p\u003e","title":"Remote Desktop Services - Certificate state 'success' but level is 'not configured'"},{"content":"I was looking for a way to deploy and automatically domain join a VM in Azure. The solution was quite simple: Azure Automation. I found the blog post of DexterPOSH very useful, but the script doesn\u0026rsquo;t work for me. Follow the steps on his blog and use this script below. I\u0026rsquo;ll update this post if I find some improvements.\nDon\u0026rsquo;t forget to update the domain in the Add-Computer part. To-Do list: - Custom static IP as variable.\nCustom domain as variable. 
workflow Deploy-Joined-VM { param( [parameter(Mandatory)] [String] $VMName, [parameter(Mandatory)] [String] $ServiceName = \u0026#34;VM-\u0026amp;amp;lt;Insert name\u0026amp;amp;gt;\u0026#34;, [parameter(Mandatory)] [String] $InstanceSize = \u0026#34;Small\u0026#34;, [parameter(Mandatory)] [String] $VMImageName = \u0026#34;Specify custom or default image name\u0026#34;, [parameter(Mandatory)] [String] $AzureSubscriptionName = \u0026#34;Subscription-1\u0026#34;, [parameter(Mandatory)] [String] $StorageAccountName = \u0026#34;contoso\u0026#34;, [parameter(Mandatory)] [String] $VMSubnetName = \u0026#34;subnet-1\u0026#34;, [parameter(Mandatory)] [String] $VMVnetName = \u0026#34;CORP.contoso.com\u0026#34;, [parameter(Mandatory)] [String] $VMAffinityGroup = \u0026#34;West-Europe\u0026#34; ) $verbosepreference = \u0026#39;continue\u0026#39; #Change this to your needs $DomainJoinAccount = \u0026#34;Domain Join Account\u0026#34; $LocalAccount = \u0026#34;LocalAdmin\u0026#34; $AutomationAccount = \u0026#34;Azure Automation Account\u0026#34; #Get the Credentials to authenticate agains Azure Write-Verbose -Message \u0026#34;Getting the Credentials\u0026#34; $Cred = Get-AutomationPSCredential -Name $AutomationAccount $LocalCred = Get-AutomationPSCredential -Name $LocalAccount $DomainCred = Get-AutomationPSCredential -Name $DomainJoinAccount #Add the Account to the Workflow Write-Verbose -Message \u0026#34;Adding the Azure Automation Account to Authenticate\u0026#34; Add-AzureAccount -Credential $Cred #select the Subscription Write-Verbose -Message \u0026#34;Selecting the $AzureSubscriptionName Subscription\u0026#34; Select-AzureSubscription -SubscriptionName $AzureSubscriptionName #Set the Storage for the Subscrption Write-Verbose -Message \u0026#34;Setting the Storage Account for the Subscription\u0026#34; Set-AzureSubscription -SubscriptionName $AzureSubscriptionName -CurrentStorageAccountName $StorageAccountName #Select the most recent Server 2012 R2 Image Write-Verbose 
-Message \u0026#34;Getting the Image details\u0026#34; $imagename = Get-AzureVMImage | where-object -filterscript { $_.ImageName -eq $VMImageName } | Sort-Object -Descending -Property PublishedDate | Select-Object -First 1 | select -ExpandProperty ImageName #use the above Image selected to build a new VM and wait for it to Boot $Username = $LocalCred.UserName $Password = $LocalCred.GetNetworkCredential().Password New-AzureQuickVM -Windows -ServiceName $ServiceName -Name $VMName -ImageName $imagename -Password $Password -AdminUsername $Username -SubnetNames $VMSubnetName -VNetName $VMVnetName -InstanceSize $InstanceSize -AffinityGroup $VMAffinityGroup -WaitForBoot Write-Verbose -Message \u0026#34;The VM is created and booted up now.. Doing a checkpoint\u0026#34; #CheckPoint the workflow CheckPoint-WorkFlow Write-Verbose -Message \u0026#34;Reached CheckPoint\u0026#34; #Call the Function Connect-VM to import the Certificate and give back the WinRM uri $WinRMURi = Get-AzureWinRMUri -ServiceName $ServiceName -Name $VMName | Select-Object -ExpandProperty AbsoluteUri InlineScript { do { #open a PSSession to the VM $Session = New-PSSession -ConnectionUri $Using:WinRMURi -Credential $Using:LocalCred -Name $using:VMName -SessionOption (New-PSSessionOption -SkipCACheck ) -ErrorAction SilentlyContinue Write-Verbose -Message \u0026#34;Trying to open a PSSession to the VM $Using:VMName \u0026#34; } While (! $Session) #Once the Session is opened, first step is to join the new VM to the domain if ($Session) { Write-Verbose -Message \u0026#34;Found a Session opened to VM $using:VMname. 
Now will try to add it to the domain\u0026#34; Invoke-command -Session $Session -ArgumentList $Using:DomainCred -ScriptBlock { param($cred) Add-Computer -DomainName \u0026#34;corp.contoso.com\u0026#34; -DomainCredential $cred Restart-Computer -Force } } } } #Workflow end ","permalink":"https://devsecninja.com/2015/02/18/azure-deploy-and-automatically-domain-join-a-vm-with-azure-automation-runbooks/","summary":"\u003cp\u003eI was looking for a way to deploy and automatically domain join a VM in Azure. The solution was quite simple: Azure Automation. I found \u003ca href=\"http://www.dexterposh.com/2014/10/azure-automation-deploy-domain-join-vm.html\" title=\"Deploy a Windows 10 VM (Server Tech Preview) \u0026amp; domain join\"\u003ethe blog post of DexterPOSH\u003c/a\u003e very useful, but the script doesn\u0026rsquo;t work for me. Follow the steps on his blog and use this script below. I\u0026rsquo;ll update this post if I find some improvements.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eDon\u0026rsquo;t forget to update the domain in the Add-Computer part.\u003c/strong\u003e To-Do list: - Custom static IP as variable.\u003c/p\u003e","title":"Azure - Deploy and automatically domain join a VM with Azure Automation Runbooks"},{"content":"I was looking for a way to automatically deploy a VM in Azure. The solution was quite simple: Azure Automation. I found the blog post of DexterPOSH very useful, but the script doesn\u0026rsquo;t work for me. Follow the steps on his blog and use this script below. I\u0026rsquo;ll update this post if I find some improvements.\nTo-Do list:\nCustom static IP as variable. Custom domain as variable. 
workflow Deploy-NonJoined-VM { param( [parameter(Mandatory)] [String] $VMName, [parameter(Mandatory)] [String] $ServiceName = \u0026#34;contoso\u0026lt;Insert name\u0026gt;\u0026#34;, [parameter(Mandatory)] [String] $InstanceSize = \u0026#34;Small\u0026#34;, [parameter(Mandatory)] [String] $VMImageName = \u0026#34;Specify custom or default image name\u0026#34;, [parameter(Mandatory)] [String] $AzureSubscriptionName = \u0026#34;Subscription-1\u0026#34;, [parameter(Mandatory)] [String] $StorageAccountName = \u0026#34;contoso\u0026#34;, [parameter(Mandatory)] [String] $VMSubnetName = \u0026#34;subnet-1\u0026#34;, [parameter(Mandatory)] [String] $VMVnetName = \u0026#34;CORP.contoso.com\u0026#34;, [parameter(Mandatory)] [String] $VMAffinityGroup = \u0026#34;West-Europe\u0026#34; ) $verbosepreference = \u0026#39;continue\u0026#39; #Change this to your needs $DomainJoinAccount = \u0026#34;Domain Join Account\u0026#34; $LocalAccount = \u0026#34;LocalAdmin\u0026#34; $AutomationAccount = \u0026#34;Azure Automation Account\u0026#34; #Get the Credentials to authenticate agains Azure Write-Verbose -Message \u0026#34;Getting the Credentials\u0026#34; $Cred = Get-AutomationPSCredential -Name $AutomationAccount $LocalCred = Get-AutomationPSCredential -Name $LocalAccount $DomainCred = Get-AutomationPSCredential -Name $DomainJoinAccount #Add the Account to the Workflow Write-Verbose -Message \u0026#34;Adding the Azure Automation Account to Authenticate\u0026#34; Add-AzureAccount -Credential $Cred #select the Subscription Write-Verbose -Message \u0026#34;Selecting the $AzureSubscriptionName Subscription\u0026#34; Select-AzureSubscription -SubscriptionName $AzureSubscriptionName #Set the Storage for the Subscrption Write-Verbose -Message \u0026#34;Setting the Storage Account for the Subscription\u0026#34; Set-AzureSubscription -SubscriptionName $AzureSubscriptionName -CurrentStorageAccountName $StorageAccountName #Select the most recent Server 2012 R2 Image Write-Verbose -Message 
\u0026#34;Getting the Image details\u0026#34; $imagename = Get-AzureVMImage | where-object -filterscript { $_.ImageName -eq $VMImageName } | Sort-Object -Descending -Property PublishedDate | Select-Object -First 1 | select -ExpandProperty ImageName #use the above Image selected to build a new VM and wait for it to Boot $Username = $LocalCred.UserName $Password = $LocalCred.GetNetworkCredential().Password New-AzureQuickVM -Windows -ServiceName $ServiceName -Name $VMName -ImageName $imagename -Password $Password -AdminUsername $Username -SubnetNames $VMSubnetName -VNetName $VMVnetName -InstanceSize $InstanceSize -AffinityGroup $VMAffinityGroup -WaitForBoot Write-Verbose -Message \u0026#34;The VM is created and booted up now.. Deployment done.\u0026#34; } #Workflow end ","permalink":"https://devsecninja.com/2015/02/18/azure-automatically-deploy-a-vm-in-azure-runbook/","summary":"\u003cp\u003eI was looking for a way to automatically deploy a VM in Azure. The solution was quite simple: Azure Automation. I found \u003ca href=\"http://www.dexterposh.com/2014/10/azure-automation-deploy-domain-join-vm.html\" title=\"Deploy a Windows 10 VM (Server Tech Preview) \u0026amp; domain join\"\u003ethe blog post of DexterPOSH\u003c/a\u003e very useful, but the script doesn\u0026rsquo;t work for me. Follow the steps on his blog and use this script below. 
I\u0026rsquo;ll update this post if I find some improvements.\u003c/p\u003e\n\u003cp\u003eTo-Do list:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eCustom static IP as variable.\u003c/li\u003e\n\u003cli\u003eCustom domain as variable.\u003c/li\u003e\n\u003c/ul\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" class=\"chroma\"\u003e\u003ccode class=\"language-powershell\" data-lang=\"powershell\"\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"kd\"\u003eworkflow\u003c/span\u003e\u003cspan class=\"w\"\u003e \u003c/span\u003e\u003cspan class=\"nb\"\u003eDeploy-NonJoined-VM\u003c/span\u003e \u003cspan class=\"p\"\u003e{\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"k\"\u003eparam\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"nb\"\u003eparameter\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"na\"\u003eMandatory\u003c/span\u003e\u003cspan class=\"p\"\u003e)]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"no\"\u003eString\u003c/span\u003e\u003cspan class=\"p\"\u003e]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"nv\"\u003e$VMName\u003c/span\u003e\u003cspan class=\"p\"\u003e,\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan 
class=\"nb\"\u003eparameter\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"na\"\u003eMandatory\u003c/span\u003e\u003cspan class=\"p\"\u003e)]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"no\"\u003eString\u003c/span\u003e\u003cspan class=\"p\"\u003e]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"nv\"\u003e$ServiceName\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;contoso\u0026lt;Insert name\u0026gt;\u0026#34;\u003c/span\u003e\u003cspan class=\"p\"\u003e,\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"nb\"\u003eparameter\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"na\"\u003eMandatory\u003c/span\u003e\u003cspan class=\"p\"\u003e)]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"no\"\u003eString\u003c/span\u003e\u003cspan class=\"p\"\u003e]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"nv\"\u003e$InstanceSize\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Small\u0026#34;\u003c/span\u003e\u003cspan class=\"p\"\u003e,\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan 
class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"nb\"\u003eparameter\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"na\"\u003eMandatory\u003c/span\u003e\u003cspan class=\"p\"\u003e)]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"no\"\u003eString\u003c/span\u003e\u003cspan class=\"p\"\u003e]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"nv\"\u003e$VMImageName\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Specify custom or default image name\u0026#34;\u003c/span\u003e\u003cspan class=\"p\"\u003e,\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"nb\"\u003eparameter\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"na\"\u003eMandatory\u003c/span\u003e\u003cspan class=\"p\"\u003e)]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"no\"\u003eString\u003c/span\u003e\u003cspan class=\"p\"\u003e]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"nv\"\u003e$AzureSubscriptionName\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Subscription-1\u0026#34;\u003c/span\u003e\u003cspan class=\"p\"\u003e,\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan 
class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"nb\"\u003eparameter\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"na\"\u003eMandatory\u003c/span\u003e\u003cspan class=\"p\"\u003e)]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"no\"\u003eString\u003c/span\u003e\u003cspan class=\"p\"\u003e]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"nv\"\u003e$StorageAccountName\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;contoso\u0026#34;\u003c/span\u003e\u003cspan class=\"p\"\u003e,\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"nb\"\u003eparameter\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"na\"\u003eMandatory\u003c/span\u003e\u003cspan class=\"p\"\u003e)]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"no\"\u003eString\u003c/span\u003e\u003cspan class=\"p\"\u003e]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"nv\"\u003e$VMSubnetName\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;subnet-1\u0026#34;\u003c/span\u003e\u003cspan 
class=\"p\"\u003e,\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"nb\"\u003eparameter\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"na\"\u003eMandatory\u003c/span\u003e\u003cspan class=\"p\"\u003e)]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"no\"\u003eString\u003c/span\u003e\u003cspan class=\"p\"\u003e]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"nv\"\u003e$VMVnetName\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;CORP.contoso.com\u0026#34;\u003c/span\u003e\u003cspan class=\"p\"\u003e,\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"nb\"\u003eparameter\u003c/span\u003e\u003cspan class=\"p\"\u003e(\u003c/span\u003e\u003cspan class=\"na\"\u003eMandatory\u003c/span\u003e\u003cspan class=\"p\"\u003e)]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"p\"\u003e[\u003c/span\u003e\u003cspan class=\"no\"\u003eString\u003c/span\u003e\u003cspan class=\"p\"\u003e]\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e        \u003cspan class=\"nv\"\u003e$VMAffinityGroup\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan 
class=\"s2\"\u003e\u0026#34;West-Europe\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"p\"\u003e)\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nv\"\u003e$verbosepreference\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s1\"\u003e\u0026#39;continue\u0026#39;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"c\"\u003e#Change this to your needs\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nv\"\u003e$DomainJoinAccount\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Domain Join Account\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nv\"\u003e$LocalAccount\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;LocalAdmin\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nv\"\u003e$AutomationAccount\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Azure Automation Account\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"c\"\u003e#Get the Credentials to authenticate agains Azure\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan 
class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eWrite-Verbose\u003c/span\u003e \u003cspan class=\"n\"\u003e-Message\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Getting the Credentials\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nv\"\u003e$Cred\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"nb\"\u003eGet-AutomationPSCredential\u003c/span\u003e \u003cspan class=\"n\"\u003e-Name\u003c/span\u003e \u003cspan class=\"nv\"\u003e$AutomationAccount\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nv\"\u003e$LocalCred\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"nb\"\u003eGet-AutomationPSCredential\u003c/span\u003e \u003cspan class=\"n\"\u003e-Name\u003c/span\u003e \u003cspan class=\"nv\"\u003e$LocalAccount\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nv\"\u003e$DomainCred\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"nb\"\u003eGet-AutomationPSCredential\u003c/span\u003e \u003cspan class=\"n\"\u003e-Name\u003c/span\u003e \u003cspan class=\"nv\"\u003e$DomainJoinAccount\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"c\"\u003e#Add the Account to the Workflow\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eWrite-Verbose\u003c/span\u003e \u003cspan class=\"n\"\u003e-Message\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Adding the Azure Automation Account to 
Authenticate\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eAdd-AzureAccount\u003c/span\u003e \u003cspan class=\"n\"\u003e-Credential\u003c/span\u003e \u003cspan class=\"nv\"\u003e$Cred\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"c\"\u003e#select the Subscription\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eWrite-Verbose\u003c/span\u003e \u003cspan class=\"n\"\u003e-Message\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Selecting the \u003c/span\u003e\u003cspan class=\"nv\"\u003e$AzureSubscriptionName\u003c/span\u003e\u003cspan class=\"s2\"\u003e Subscription\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eSelect-AzureSubscription\u003c/span\u003e \u003cspan class=\"n\"\u003e-SubscriptionName\u003c/span\u003e \u003cspan class=\"nv\"\u003e$AzureSubscriptionName\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"c\"\u003e#Set the Storage for the Subscrption\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eWrite-Verbose\u003c/span\u003e \u003cspan class=\"n\"\u003e-Message\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;Setting the Storage Account for the Subscription\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e 
   \u003cspan class=\"nb\"\u003eSet-AzureSubscription\u003c/span\u003e \u003cspan class=\"n\"\u003e-SubscriptionName\u003c/span\u003e \u003cspan class=\"nv\"\u003e$AzureSubscriptionName\u003c/span\u003e \u003cspan class=\"n\"\u003e-CurrentStorageAccountName\u003c/span\u003e \u003cspan class=\"nv\"\u003e$StorageAccountName\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"c\"\u003e#Select the most recent Server 2012 R2 Image\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eWrite-Verbose\u003c/span\u003e \u003cspan class=\"n\"\u003e-Message\u003c/span\u003e  \u003cspan class=\"s2\"\u003e\u0026#34;Getting the Image details\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nv\"\u003e$imagename\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"nb\"\u003eGet-AzureVMImage\u003c/span\u003e \u003cspan class=\"p\"\u003e|\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003ewhere-object\u003c/span\u003e \u003cspan class=\"n\"\u003e-filterscript\u003c/span\u003e \u003cspan class=\"p\"\u003e{\u003c/span\u003e \u003cspan class=\"nv\"\u003e$_\u003c/span\u003e\u003cspan class=\"p\"\u003e.\u003c/span\u003e\u003cspan class=\"py\"\u003eImageName\u003c/span\u003e \u003cspan class=\"o\"\u003e-eq\u003c/span\u003e \u003cspan class=\"nv\"\u003e$VMImageName\u003c/span\u003e \u003cspan class=\"p\"\u003e}\u003c/span\u003e \u003cspan class=\"p\"\u003e|\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan 
class=\"nb\"\u003eSort-Object\u003c/span\u003e \u003cspan class=\"n\"\u003e-Descending\u003c/span\u003e \u003cspan class=\"n\"\u003e-Property\u003c/span\u003e \u003cspan class=\"n\"\u003ePublishedDate\u003c/span\u003e \u003cspan class=\"p\"\u003e|\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eSelect-Object\u003c/span\u003e \u003cspan class=\"n\"\u003e-First\u003c/span\u003e \u003cspan class=\"mf\"\u003e1\u003c/span\u003e \u003cspan class=\"p\"\u003e|\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eselect \u003c/span\u003e\u003cspan class=\"n\"\u003e-ExpandProperty\u003c/span\u003e \u003cspan class=\"n\"\u003eImageName\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"c\"\u003e#use the above Image selected to build a new VM and wait for it to Boot\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nv\"\u003e$Username\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"nv\"\u003e$LocalCred\u003c/span\u003e\u003cspan class=\"p\"\u003e.\u003c/span\u003e\u003cspan class=\"py\"\u003eUserName\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nv\"\u003e$Password\u003c/span\u003e \u003cspan class=\"p\"\u003e=\u003c/span\u003e \u003cspan class=\"nv\"\u003e$LocalCred\u003c/span\u003e\u003cspan class=\"p\"\u003e.\u003c/span\u003e\u003cspan class=\"py\"\u003eGetNetworkCredential\u003c/span\u003e\u003cspan class=\"p\"\u003e().\u003c/span\u003e\u003cspan 
class=\"py\"\u003ePassword\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eNew-AzureQuickVM\u003c/span\u003e \u003cspan class=\"n\"\u003e-Windows\u003c/span\u003e \u003cspan class=\"n\"\u003e-ServiceName\u003c/span\u003e \u003cspan class=\"nv\"\u003e$ServiceName\u003c/span\u003e \u003cspan class=\"n\"\u003e-Name\u003c/span\u003e \u003cspan class=\"nv\"\u003e$VMName\u003c/span\u003e \u003cspan class=\"n\"\u003e-ImageName\u003c/span\u003e \u003cspan class=\"nv\"\u003e$imagename\u003c/span\u003e \u003cspan class=\"n\"\u003e-Password\u003c/span\u003e \u003cspan class=\"nv\"\u003e$Password\u003c/span\u003e \u003cspan class=\"n\"\u003e-AdminUsername\u003c/span\u003e \u003cspan class=\"nv\"\u003e$Username\u003c/span\u003e \u003cspan class=\"n\"\u003e-SubnetNames\u003c/span\u003e \u003cspan class=\"nv\"\u003e$VMSubnetName\u003c/span\u003e \u003cspan class=\"n\"\u003e-VNetName\u003c/span\u003e \u003cspan class=\"nv\"\u003e$VMVnetName\u003c/span\u003e \u003cspan class=\"n\"\u003e-InstanceSize\u003c/span\u003e \u003cspan class=\"nv\"\u003e$InstanceSize\u003c/span\u003e \u003cspan class=\"n\"\u003e-AffinityGroup\u003c/span\u003e \u003cspan class=\"nv\"\u003e$VMAffinityGroup\u003c/span\u003e \u003cspan class=\"n\"\u003e-WaitForBoot\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e    \u003cspan class=\"nb\"\u003eWrite-Verbose\u003c/span\u003e \u003cspan class=\"n\"\u003e-Message\u003c/span\u003e \u003cspan class=\"s2\"\u003e\u0026#34;The VM is created and booted up now.. 
Deployment done.\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan class=\"line\"\u003e\u003cspan class=\"cl\"\u003e\u003cspan class=\"p\"\u003e}\u003c/span\u003e \u003cspan class=\"c\"\u003e#Workflow end\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e","title":"Azure - Automatically deploy a VM in Azure (Runbook)"},{"content":"I have an Intel NUC with 8 GB as media center and Hyper-V host for a test domain controller. I attached 2 physical external disks (1x 2,5\u0026quot; USB 3, 1x 3,5\u0026quot; USB 2) and created a storage pool. A couple of weeks later, I bought a new 2,5\u0026quot; USB 3 disk with 1 TB to replace the older USB 2.0 disk.\nThat disk is a lot quieter than the older one.\nFirst, attach the new disk and make sure it\u0026rsquo;s added to your storage pool.\nYou can\u0026rsquo;t remove the physical disk from the console when it\u0026rsquo;s in use.\nThe only option is to rename a disk:[\n](/images/2015/01/remove-physicaldisk-1.png)\nRemove physical disk from the storage pool with PowerShell Open PowerShell (as Administrator) and paste the following code:\nGet-PhysicalDisk\nAs you can see, my drives are in a healthy state and operational. We need the FriendlyName of the device. You can use -AutoSize after ft or just use FriendlyName:\nGet-PhysicalDisk | ft FriendlyName\nCopy the FriendlyName of the Physical Drive you want to remove from the storage pool and use the following code:\nSet-PhysicalDisk -FriendlyName \u0026ldquo;\u0026rdquo; -Usage Retired\nWe have to repair the Virtual Disks on the drives, because some Virtual Disks are maybe located on the retired drive. 
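Put together, the whole retire-and-repair flow from this post looks roughly like the minimal sketch below; the friendly names (OldUsbDisk, Storage pool) are placeholders for your own disk and pool names, not values from my setup:
```powershell
# Retire the disk you want to remove (same command as above, with a name filled in)
Set-PhysicalDisk -FriendlyName 'OldUsbDisk' -Usage Retired

# Repair every virtual disk so no data is left on the retired drive
Get-VirtualDisk | ForEach-Object { Repair-VirtualDisk -FriendlyName $_.FriendlyName }

# Watch the repair jobs; remove the disk only after they have finished
Get-StorageJob
$DiskToRemove = Get-PhysicalDisk | Where-Object { $_.Usage -eq 'Retired' }
Remove-PhysicalDisk -PhysicalDisks $DiskToRemove -StoragePoolFriendlyName 'Storage pool'
```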
Firstly, make sure you know the name of the Virtual Disk:\nGet-VirtualDisk\nIf your Virtual Disk names are too long for the output, you can use\nGet-VirtualDisk | ft -AutoSize\nCopy the FriendlyName of the first Virtual Disk and type:\nRepair-VirtualDisk -FriendlyName \u0026ldquo;YourVirtualDisk\u0026rdquo;\nDo this for every Virtual Disk in the storage pool. Some repairs are quick, because the Virtual Disk was not located on the drive or was very small. Open a new PowerShell window to monitor the repairs with\nGet-StorageJob\nLast but not least: remove the PhysicalDisk if all repairs are successfully done. Make sure that your drive is in the output of the following command:\nGet-PhysicalDisk | Where-Object { $_.Usage -eq \u0026lsquo;Retired\u0026rsquo;}\nAssign the disk to a variable:\n$DiskToRemove = Get-PhysicalDisk | Where-Object { $_.Usage -eq \u0026lsquo;Retired\u0026rsquo;}\nFind the name of the storage pool:\nGet-StoragePool\nDelete the physical disk from the storage pool:\nRemove-PhysicalDisk -PhysicalDisks $DiskToRemove -StoragePoolFriendlyName \u0026ldquo;Storage pool\u0026rdquo;\nThis should work! Your drive should now be deleted.\nTroubleshooting Remove-PhysicalDisk: One of the physical disks specified could not be removed because it is still in use.\n[\n](/images/2015/01/remove-physicaldisk-12.png) If the Remove-PhysicalDisk command doesn\u0026rsquo;t work, try to remove the disk when the device is turned off, so you can see which virtual disk is using the physical drive. Turn the device off and attach the drive. I had a simple storage space on the old hard disk. I had to delete that storage space manually before the command above worked.\n","permalink":"https://devsecninja.com/2015/01/02/windows-storage-spaces-remove-physical-disk-from-storage-pool-with-powershell/","summary":"\u003cp\u003eI have an Intel NUC with 8 GB as media center and Hyper-V host for a test domain controller. 
I attached 2 physical external disks (1x 2,5\u0026quot; USB 3, 1x 3,5\u0026quot; USB 2) and created a storage pool. A couple of weeks later, I bought a new 2,5\u0026quot; USB 3 disk with 1 TB to replace the older USB 2.0 disk.\u003c/p\u003e\n\u003cp\u003eThat disk is a lot quieter than the older one.\u003c/p\u003e","title":"Windows Storage Spaces - Remove physical disk from storage pool with PowerShell"},{"content":"Problem The following error occurs in the ccm.log when doing a remote client install:\nUnable to connect to WMI (root\\ccm) on remote machine \u0026ldquo;COMPUTER\u0026rdquo;\nSolution Allow the following rules in Windows Firewall: Outbound and inbound: File and Printer Sharing Inbound: Windows Management Instrumentation (WMI)\n","permalink":"https://devsecninja.com/2014/12/28/sccm-unable-to-connect-to-wmi-root%5Cccm-on-remote-machine/","summary":"\u003ch3 id=\"problem\"\u003eProblem\u003c/h3\u003e\n\u003cp\u003eThe following error occurs in the ccm.log when doing a remote client install:\u003c/p\u003e\n\u003cblockquote\u003e\n\u003cp\u003eUnable to connect to WMI (root\\ccm) on remote machine \u0026ldquo;COMPUTER\u0026rdquo;\u003c/p\u003e\n\u003c/blockquote\u003e\n\u003ch3 id=\"solution\"\u003eSolution\u003c/h3\u003e\n\u003cp\u003eAllow the following rules in Windows Firewall: \u003cstrong\u003eOutbound and inbound: File and Printer Sharing\u003c/strong\u003e \u003cstrong\u003eInbound: Windows Management Instrumentation (WMI)\u003c/strong\u003e\u003c/p\u003e","title":"SCCM - Unable to connect to WMI (root\\ccm) on remote machine"},{"content":"I was training for my Citrix XenApp 6.5 certification. When I was running the \u0026ldquo;Configure and discovery\u0026rdquo; wizard in the Citrix XenApp application, I received the error below.\nI presume that SSO is already installed. Make sure you login as a Domain Admin on your domain controller. (Very important)\nCancel the wizard and start the XenApp 6.5 CD on your domain controller. 
Browse to Manually install components -\u0026gt; Server Components -\u0026gt; Additional Features -\u0026gt; Single Sign-On -\u0026gt; Central Store -\u0026gt; Active Directory -\u0026gt; Step 1: Extend Active Directory. After completing the first step, run Step 2: Create Central Store\nClose and reopen the Citrix AppCenter on your XenApp server and try the discovery again.\nGood luck!\n","permalink":"https://devsecninja.com/2014/08/09/citrix-xenapp-6.5-the-central-store-inspection-failed-single-sign-on/","summary":"\u003cp\u003eI was training for my Citrix XenApp 6.5 certification. When I was running the \u0026ldquo;Configure and discovery\u0026rdquo; wizard in the Citrix XenApp application, I received the error below.\u003c/p\u003e\n\u003cp\u003eI presume that SSO is already installed. Make sure you login as a Domain Admin on your domain controller. (Very important)\u003c/p\u003e\n\u003cp\u003e\u003ca href=\"https://cloudenius.com/images/2014/12/30-07-2014_07-41-47-8i9x5s9g8q972p6r60-1025x695.png\"\u003e\u003cimg alt=\"30-07-2014_07-41-47-8I9X5S9G8Q972P6r60-1025x695\" loading=\"lazy\" src=\"https://cloudenius.com/images/2014/12/30-07-2014_07-41-47-8i9x5s9g8q972p6r60-1025x695.png\"\u003e\u003c/a\u003e\u003c/p\u003e\n\u003cp\u003eCancel the wizard and start the XenApp 6.5 CD on your domain controller. Browse to Manually install components -\u0026gt; Server Components -\u0026gt; Additional Features -\u0026gt; Single Sign-On -\u0026gt; Central Store -\u0026gt; Active Directory -\u0026gt; Step 1: Extend Active Directory. After completing the first step, run Step 2: Create Central Store\u003c/p\u003e","title":"Citrix XenApp 6.5: the central store inspection failed (Single Sign-On)"},{"content":"I updated my Dell Latitude E6540 BIOS to A10 today. After updating, the screen stays black after rebooting. If you reboot your PC, your resolution has changed and the drivers are not working anymore. 
Opening the AMD Catalyst Control Center gives the error “AMD drivers are not functioning properly”.\nSolution: go to the Dell website. Click the Support page and type your Service Tag. Download the newest AMD and Intel graphics drivers. I installed the AMD driver after installing the Intel driver. You don’t have to delete the old driver. Reboot after installing the 2 drivers. Your notebook should work as before.\n","permalink":"https://devsecninja.com/2014/07/21/amd-drivers-are-not-functioning-properly-after-updating-dell-latitude-e6540-bios-to-a8/a10/","summary":"\u003cp\u003eI updated my Dell Latitude E6540 BIOS to A10 today. After updating, the screen stays black after rebooting. If you reboot your PC, your resolution has changed and the drivers are not working anymore. Opening the AMD Catalyst Control Center gives the error “AMD drivers are not functioning properly”.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSolution:\u003c/strong\u003e go to the Dell website. Click the Support page and type your Service Tag. Download the newest AMD and Intel graphics drivers. I installed the AMD driver after installing the Intel driver. You don’t have to delete the old driver. Reboot after installing the 2 drivers. Your notebook should work as before.\u003c/p\u003e","title":"AMD drivers are not functioning properly after updating Dell Latitude E6540 BIOS to A8/A10"},{"content":"When I was installing Exchange 2013 in my home lab, I received an error message on 98 %. You will see the screen below for 15-20 minutes. 
[\n](https://cloudenius.com/images/2014/12/exchange-2013-service-fms-failed-to-reach-status-running-error-702x576.jpg) After a moment your installation ends up in an error:\nError: The following error was generated when \u0026ldquo;$error.Clear(); if ($RoleStartTransportService) { start-SetupService -ServiceName FMS } \u0026rdquo; was run: \u0026ldquo;Service \u0026lsquo;FMS\u0026rsquo; failed to reach status \u0026lsquo;Running\u0026rsquo; on this server.\u0026rdquo;.\nWhen I was searching on Google, I found some topics that say FMS stands for Filtering Management Service. Some people say it\u0026rsquo;s because you\u0026rsquo;ve enabled IPv6 or it\u0026rsquo;s about your write permissions on the hard disk. I tried a couple of things without luck. Because the Exchange server is running in my home lab, I don\u0026rsquo;t need the malware protection.\nThe solution Every time I started the installation, Exchange resumed the incomplete installation. You first need to uninstall the incomplete installation from your machine. Go to \u0026ldquo;Programs and Features\u0026rdquo; and delete Exchange from there. Then restart your machine and try to install Exchange again, but untick the box that installs the malware protection. (Of course, this is only recommended in a testing environment.)\n","permalink":"https://devsecninja.com/2014/01/05/exchange-2013-service-fms-failed-to-reach-status-running-on-this-server./","summary":"\u003cp\u003eWhen I was installing Exchange 2013 in my home lab, I received an error message on 98 %. You will see the screen below for 15-20 minutes. 
[\u003c/p\u003e\n\u003cp\u003e\u003cimg alt="Exchange-2013-Service-FMS-failed-to-reach-status-Running-error-702x576" loading="lazy" src="https://cloudenius.com/images/2014/12/exchange-2013-service-fms-failed-to-reach-status-running-error-702x576.jpg"\u003e](\u003ca href="https://cloudenius.com/images/2014/12/exchange-2013-service-fms-failed-to-reach-status-running-error-702x576.jpg"\u003ehttps://cloudenius.com/images/2014/12/exchange-2013-service-fms-failed-to-reach-status-running-error-702x576.jpg\u003c/a\u003e) After a moment your installation ends up in an error:\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eError:\u003c/strong\u003e \u003cstrong\u003eThe following error was generated when \u0026ldquo;$error.Clear();\u003c/strong\u003e \u003cstrong\u003eif ($RoleStartTransportService)\u003c/strong\u003e \u003cstrong\u003e{\u003c/strong\u003e \u003cstrong\u003estart-SetupService -ServiceName FMS\u003c/strong\u003e \u003cstrong\u003e}\u003c/strong\u003e \u003cstrong\u003e\u0026rdquo; was run: \u0026ldquo;Service \u0026lsquo;FMS\u0026rsquo; failed to reach status \u0026lsquo;Running\u0026rsquo; on this server.\u0026rdquo;.\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003eWhen I was searching on Google, I found some topics that say FMS stands for Filtering Management Service. Some people say it\u0026rsquo;s because you\u0026rsquo;ve enabled IPv6 or it\u0026rsquo;s about your write permissions on the hard disk. I tried a couple of things without luck. Because the Exchange server is running in my home lab, I don\u0026rsquo;t need the malware protection.\u003c/p\u003e","title":"Exchange 2013 - Service FMS failed to reach status 'Running' on this server."},{"content":"A customer asked me to include the customer\u0026rsquo;s email address in the order confirmation emails, sent out by Magento. 
You will have to edit or create a new template and add this shortcode: {{htmlescape var=$order.getCustomerEmail()}}\n","permalink":"https://devsecninja.com/2013/12/01/magento-inserting-the-customers-email-address-into-the-email-templates/","summary":"\u003cp\u003eA customer asked me to define the customer\u0026rsquo;s email address in the order confirmation emails sent out by Magento. You will have to edit or create a new template and add this shortcode:\n{{htmlescape var=$order.getCustomerEmail()}}\u003c/p\u003e","title":"Magento – Inserting the Customer’s Email Address into the Email Templates"},{"content":"I was installing a Cisco Catalyst Express 500 switch, but I failed a couple of times to enter the setup menu.\nWhen you try to reset the switch to factory defaults, you should get an IP address from the switch.\nUsually this is 10.1.1.2.\nIt worked only with my old Windows XP laptop, not with my Windows 8.1 machines. To fix this problem, set your IP address manually: IP address: 10.1.1.2 Subnet mask: 255.255.255.0 Default gateway: 10.1.1.1 Try again to reset your switch to factory defaults, and you will now have access to the switch\u0026rsquo;s web interface.\nGood luck!\nIf you have any questions, let me know!\n","permalink":"https://devsecninja.com/2013/10/17/cisco-catalyst-express-500-poe-switch-not-working-with-windows-vista/7/8/8.1/","summary":"\u003cp\u003eI was installing a Cisco Catalyst Express 500 switch, but I failed a couple of times to enter the setup menu.\u003c/p\u003e\n\u003cp\u003eWhen you try to reset the switch to factory defaults, you should get an IP address from the switch.\u003c/p\u003e\n\u003cp\u003eUsually this is 10.1.1.2.\u003c/p\u003e\n\u003cp\u003eIt worked only with my old Windows XP laptop, not with my Windows 8.1 machines. To fix this problem, set your IP address manually: IP address: 10.1.1.2 Subnet mask: 255.255.255.0 
Default gateway: 10.1.1.1 Try again to reset your switch to factory defaults, and you will now have access to the switch\u0026rsquo;s web interface.\u003c/p\u003e","title":"Cisco - Catalyst Express 500 PoE switch not working with Windows Vista/7/8/8.1"},{"content":"Jean-Paul van Ravensberg works for Microsoft as a Cloud Solution Architect. He enjoys the dynamics of interacting with clients to shape a solution, and the focus on delivering powerful security \u0026amp; automation solutions. Additionally, he is eager to learn about new technologies \u0026amp; programming languages. This often results in new certifications \u0026amp; publications on this blog. He likes to automate day-to-day tasks with Home Automation in Python, PowerShell and by leveraging the Azure Cloud Platform. Prior to joining Microsoft, Jean-Paul worked for Avanade as a Sr. Consultant. Follow Jean-Paul on LinkedIn.\n","permalink":"https://devsecninja.com/about/","summary":"\u003cp\u003eJean-Paul van Ravensberg works for Microsoft as a Cloud Solution Architect. He enjoys the dynamics of interacting with clients to shape a solution, and the focus on delivering powerful security \u0026amp; automation solutions. Additionally, he is eager to learn about new technologies \u0026amp; programming languages. This often results in new certifications \u0026amp; publications on this blog. He likes to automate day-to-day tasks with Home Automation in Python, PowerShell and by leveraging the Azure Cloud Platform. Prior to joining Microsoft, Jean-Paul worked for Avanade as a Sr. Consultant. Follow Jean-Paul on \u003ca href=\"https://linkedin.com/in/jvravensberg\"\u003eLinkedIn\u003c/a\u003e.\u003c/p\u003e","title":"About the author"}]