Right, I keep having to do this, and keep having to look this up, so here it is.
If you want to do a web.config transform for an assembly binding redirect, it can be a bit tricky. The assembly details are in an <assemblyIdentity /> element, and the <bindingRedirect /> is its sibling. Yeah, I don't know why it was designed this way; I'm assuming alcohol was involved. Having the oldVersion and newVersion attributes on the same element as the assembly's identity would be much simpler.
Anyway, it is what it is. An alternative is to replace the entire <dependentAssembly /> element, but the locator becomes a bit more fiddly. Still, it works. See this example – the locator on the parent element checks the name of the child assemblyIdentity:
<dependentAssembly xdt:Transform="Replace" xdt:Locator="Condition(./_defaultNamespace:assemblyIdentity/@name='System.Runtime')">
  <assemblyIdentity name="System.Runtime" publicKeyToken="b03f5f7f11d50a3a" />
  <bindingRedirect oldVersion="0.0.0.0-4.1.2.0" newVersion="4.1.2.0" />
</dependentAssembly>
So, we’re all familiar with the Core Web Vitals scores, right? Just in case we’re not, let’s recap. These are scores that Google uses to judge the speed of your page. It’s about more than just “how much data does the page use?” or “when is the DOM ready?” – it also includes things like “when is the content painted on the page?” and “how much does stuff shift as the page is loaded?”.
This is actually really good, as these are much better metrics; people want to read your page, and these metrics focus on that.
So, one of the things I’ve noticed is that a number of our customers are not getting great scores for the Largest Contentful Paint (LCP) – and this seems to be due to a particular bugbear of mine – carousels.
Continue reading “Carousels are killing our Core Web Vitals scores”
One of my colleagues asked a question that I’ve heard dozens of times over my career…
What’s a good regular expression for validating email addresses?
Sadly, due to poor standards, poor implementation choices, and the sheer age of email, this is a surprisingly tough problem.
The best way to validate an email address is to email it, and get the user to do something. That’s not really feasible if it’s someone just filling in a form.
Failing that, I came up with:
This is the same as suggested by Microsoft, which is gratifying. You can see the logic of it here.
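The expression itself isn't reproduced here, but as an illustration of the general approach (a sketch, not the regex from the post – the class and method names are my own), a pragmatic C# check can lean on the framework's own parser in System.Net.Mail:

```csharp
using System;
using System.Net.Mail;

public static class EmailCheck
{
    // A pragmatic sanity check: let MailAddress do the heavy lifting.
    // Comparing Address back to the input rejects strings that only
    // parse because of a display-name portion (e.g. "Bob <bob@x.com>").
    public static bool LooksLikeEmail(string input)
    {
        if (string.IsNullOrWhiteSpace(input)) return false;
        try
        {
            var addr = new MailAddress(input);
            return addr.Address == input;
        }
        catch (FormatException)
        {
            return false;
        }
    }
}
```

As with any offline check, this only confirms the string is plausibly shaped – actually emailing the address remains the real test.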
I noticed something strange in Sitecore: for most of my nodes (not the Sitecore node!), the 13th hex character of the identifying GUID is ‘4’.
I had a list of about 50 of these, and my eye was drawn to the pattern. Now, I’d thought GUIDs were entirely random – but if they were, the chance of 50 page template IDs all having a ‘4’ at that character would be infinitesimal.
Weird. Except it turns out that they’re not random. I had no idea that there are different versions of GUID, or that that character defines the version of the GUID.
This requires a test, so I wrote a program to print Guid.NewGuid() a lot:
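The program isn't shown here, but a minimal version of the idea (a sketch, not the original code) might look like this:

```csharp
using System;

class GuidVersionCheck
{
    static void Main()
    {
        for (int i = 0; i < 50; i++)
        {
            string g = Guid.NewGuid().ToString();
            // In the standard "D" format (8-4-4-4-12), the character at
            // index 14 is the 13th hex digit - the GUID's version field.
            Console.WriteLine($"{g}  version digit: {g[14]}");
        }
    }
}
```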
All of them are 4s.
- GUIDs aren’t entirely random.
- They might not be very random at all, looking at some of the other GUID versions.
- Which is why they shouldn’t be used as a source of entropy for encryption.
- I still have things to learn.
We have been having a problem with Sitecore in Azure PaaS – it appears that when auto-scaling scales out, App Services are being put into rotation before they have finished starting up. This causes all sorts of weirdness.
Sitecore support recommended making sure that we had Application Initialization configured. That seems like a good idea. I’m not sure why the people who set up this instance didn’t; perhaps they were unaware of it (and to be fair, it’s something I’ve not looked at before).
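For context, a minimal sketch of what that web.config configuration might look like (the warm-up page is illustrative; the full post covers the details):

```xml
<system.webServer>
  <applicationInitialization doAppInitAfterRestart="true">
    <!-- Request this page on startup so the site is warm before traffic arrives -->
    <add initializationPage="/" />
  </applicationInitialization>
</system.webServer>
```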
Continue reading “Application Initialization for Azure Service Apps (and Sitecore)”
Google Tag Manager (GTM) is a tool that lets marketers put code for tracking users/analytics onto a website without actually having to change the code of the website. Essentially, Google Tag Manager injects the code after the page has loaded. “Tags” in this instance aren’t just #hashtags; they are snippets of code that can record data and send it to third-party services.
When uninstalling Sitecore 9, sometimes you have to delete the services it creates in Windows by hand.
The command for this is:
sc delete [service name]
The service name can be found in the properties of the service. Don’t forget to delete them both.
We all have to get zero errors when compiling solutions, but we all aim for zero warnings, too, right?
Suppressing Code Analysis errors is easy enough – and I recommend using code analysis rules (though you don’t need to enforce ALL of them; consider which rules are relevant).
However, warnings like this can be a problem:
6> blah\Caching\TokenCacheItem.cs(39,48,39,65): warning CS0067: The event ‘TokenCacheItem.DataLengthChanged’ is never used
The thing is, I need DataLengthChanged to fulfil an interface. But yes, it isn’t used. Oh, the conundrum.
What I hadn’t realised was that you can disable compiler warnings (steady – careful!). See: https://stackoverflow.com/questions/3820985/suppressing-is-never-used-and-is-never-assigned-to-warnings-in-c-sharp/3821035#answer-3821035
We’re instructing the compiler to disable the warning for rule 0067, which is the warning given above, and then, immediately after the problematic line, we’re re-enabling it.
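The pattern described looks something like this (a sketch – the interface name here is hypothetical, based on the warning text above):

```csharp
using System;

public interface IDataLengthNotifier
{
    // Hypothetical interface that forces the event to exist.
    event EventHandler DataLengthChanged;
}

public class TokenCacheItem : IDataLengthNotifier
{
#pragma warning disable 0067 // Required by IDataLengthNotifier, but this implementation never raises it.
    public event EventHandler DataLengthChanged;
#pragma warning restore 0067
}
```

Because the restore comes straight after the declaration, CS0067 stays in force for the rest of the file.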
And as noted on the post…
Caveat: As per the comment by @Jon Hanna, perhaps a few warnings is in order for this, for future finders of this question and answer.
First, and foremost, the act of suppressing a warning is akin to swallowing pills for headache. Sure, it might be the right thing to do sometimes, but it’s not a catch-all solution. Sometimes, a headache is a real symptom that you shouldn’t mask, same with warnings. It is always best to try to treat the warnings by fixing their cause, instead of just blindly removing them from the build output.
Having said that, if you need to suppress a warning, follow the pattern I laid out above. The first code line, #pragma warning disable XYZK, disables the warning for the rest of that file, or at least until a corresponding #pragma warning restore XYZK is found. Minimize the number of lines you disable these warnings on. The pattern above disables the warning for just one line.
Also, as Jon mentions, a comment as to why you’re doing this is a good idea. Disabling a warning is definitely a code-smell when done without cause, and a comment will prevent future maintainers from spending time either wondering why you did it, or even by removing it and trying to fix the warnings.
Roll on zero-warning compilations.
Finally, the web is really moving to HTTPS (thank you, Google/Chrome!). This change is long overdue, and it will be good to finally eliminate the vulnerabilities of plain HTTP.
“But“, cry some customers, “I don’t wanna pay for certificates!“.
Bought TLS certificates cost money, and non-technical customers often don’t see any value in them. All they see is cost. If only there were a free alternative.
Good news! LetsEncrypt freely offer free DV certificates, for free. That is, in fact, their raison d’être. Bad news – they only last 90 days. Bugger. Our support team really won’t want to renew lots of certs by hand every 90 days. If only this could be automated… Wait, it can be!
One of my colleagues mentioned using WinAcme to get/renew certificates with LetsEncrypt, so I thought I’d give it a go. How hard can it be? Continue reading “Using LetsEncrypt with IIS via WinAcme”
So, the NCSC has been running a study on the prevalence of the ‘Top 1000 Passwords’. It’s useful stuff, but I wondered – just how frequent are these passwords? How can they know? Where did this list come from?
I noticed, for example, that the list included baseball, which I gather is a degenerate form of rounders. It’s certainly not what I’d expect on a UK-centric list of passwords. Similarly, chicago, and redsox were unlikely. (There are, however, cricket and wanker, so it isn’t an entirely Americanised list).
I also noticed some passwords – like rasdzv3 – that I couldn’t see any obvious reason for being particularly popular.
Anyway – I wondered – how frequent are these? What was the most frequent? Continue reading “A brief analysis of the NCSC’s “Top 1000 Passwords” list”