TableLayoutPanel – doesn't autosize or autoscroll correctly

I’ve been doing some Windows Forms programming lately – not really my thing, but needs must. I’ve got an application which needs to dynamically create a form at run time, and so I’m using a System.Windows.Forms.TableLayoutPanel.

All the controls contained by the TableLayoutPanel resize automatically, and the TableLayoutPanel automatically provides scrollbars. “Great!”, I thought, “This will deal with large forms nicely.” Wrong!

The problem is when the form has too many fields – it becomes deeper than the display area of the TableLayoutPanel. This should be fine – the TableLayoutPanel should automatically add a vertical scrollbar, and adjust the size of the controls it contains to fit in the smaller area. Except it doesn’t. What I actually get is this:

Problems with TableLayoutPanel

Notice that I’ve got a vertical scrollbar, but it hasn’t resized the child controls. Thus, the scrollbar overlaps some of the fields – and I now have a nice horizontal scrollbar too. Arse.

I eventually found that others had had this problem and told Microsoft about it. However, they’ve decided not to fix it. Thanks Microsoft – that cost me an hour and a half; it’d be wise to fix it. I found the answer from the same link – add a vertical scrollbar’s width as padding on the right of the TableLayoutPanel.
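For reference, a minimal sketch of that workaround (the panel name is illustrative):

using System.Windows.Forms;

// Reserve a vertical scrollbar's width as right-hand padding, so that when
// the scrollbar appears it no longer overlaps the child controls and
// triggers a horizontal scrollbar.
tableLayoutPanel1.AutoScroll = true;
tableLayoutPanel1.Padding = new Padding(0, 0, SystemInformation.VerticalScrollBarWidth, 0);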


Adding Snippets in Visual Studio

I keep forgetting how to do this, and have to look it up every time I want to create a new SharePoint workflow.

  1. Open Visual Studio.
  2. Right-click on the toolbars and choose ‘Customize’.
  3. Click ‘Commands’ Tab.
  4. Click on the ‘Tools’ category.
  5. Add the ‘Code Snippets Manager’.
  6. Add ‘C:\Program Files\Microsoft Visual Studio 8\Xml\1033\Snippets\SharePoint Server Workflow’ to your snippets
  7. It should be straightforward from there…

Comments from my old blog:

Are you using your own project templates? I’m curious because I’m using the MOSS Workflow Project templates (both state machine and sequential) – the templates that come with the MOSS SDK – and I have the snippets already available to me.

Anyway, just throwing that out there.

By Peter {faa780ce-0f0a-4c28-81d2-3 at 17:30:01 Monday 10th September 2007

Nope, these are the ones out of the SDK. I’m not clear why, but in some VMs that I use the snippets aren’t immediately available after I install it. And normally the ‘Code Snippets Manager’ is already available too – but not always. I haven’t figured out what factor causes these things to occur :/

But yes Peter, you’re right – normally they should be available.

By Andy B at 11:16:21 Tuesday 11th September 2007


Benchmark: Speed of Encryption and Decryption using .NET Framework classes

I was reading about security stuff in the .NET framework, and dealing with the cryptographic classes in it, and it sort of set me wondering. Here are all these different encryption classes, with different block and key sizes, cipher modes, all that jazz – but what’s their performance like? Specifically, I’d read something saying that some ‘weaker’ encryption algorithms are better in some speed-critical applications ‘cos they’re faster. I wondered: how much faster?

Thus, I decided to benchmark the symmetric algorithms in the .NET Framework – DES, Triple DES, RC2 and Rijndael. To make life interesting, I thought I’d try them with different key sizes, block sizes and cipher modes.

So, I’ve linked to definitions of these factors, but for those who don’t want to read vast chunks of Wikipedia, here are my (simplified) definitions. For anyone really interested in learning how to program with encryption properly (and in learning why their 128 bit key probably isn’t 128 bits strong) I can strongly recommend the book ‘Practical Cryptography’ by Bruce Schneier and Niels Ferguson.

Symmetric ciphers are ones like you used when you were a kid. You have some operation that turns a message into garbage, and then the reverse of that operation turns that garbage into a message. Some algorithms don’t have a reverse – they are asymmetric ciphers, and are a whole different kettle of fish.

Keys are the password you use with your cipher. For example, if your cipher as a kid was to shift all the letters in the alphabet, then the key might be the number of characters shifted. Big keys are harder to break. Think of it as being just like a password or PIN. If I tell you that my PIN is 4 digits, you might be tempted to guess all 10,000 possibilities, and on average you’d figure my PIN out after 5,000 tries. If my PIN was 8 digits, then there are 100,000,000 options – and you’re less likely to try all the possibilities, eh?

Block sizes. Well, okay, some ciphers work on blocks of data, rather than each byte (or each ‘letter’). These are block ciphers. There are also stream ciphers, where each byte is encrypted one by one. Anyway, in block ciphers there is a limit to how much data can be encrypted without ‘leaking’ information. Larger block sizes can encrypt more data without that leakage. (That’s not to say that the block has been decrypted, but an attacker could start to learn things about the contents of that block.)

Cipher modes don’t really have a parallel with how you did codes as a kid. I’d describe it like this: if the cipher is about how you make an apparently random set of bits, then the cipher mode is about how you then use them. There are lots of different modes, but the .NET framework classes only seem to support 3 – ECB (Electronic Codebook), CBC (Cipher Block Chaining) and CFB (Cipher Feedback).
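In the framework classes, the mode is just a property on the algorithm object – a quick sketch:

// Any SymmetricAlgorithm subclass exposes its cipher mode as a property.
RijndaelManaged alg = new RijndaelManaged();
alg.Mode = CipherMode.CBC;   // or CipherMode.ECB, CipherMode.CFB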

So, what are the algorithms:

  • DES – An old encryption standard, now regarded as offering poor security, but so widely used that it is still in operation as a legacy system.
  • Triple DES – An improved version of DES, made by essentially applying DES three times.
  • RC2 – A moderately old encryption algorithm. Flexible key lengths, but short block size.
  • Rijndael (aka AES) – The latest encryption standard. The Rijndael algorithm was selected from several as part of a competition. It wasn’t regarded as the most secure, but it was quite quick. The Advanced Encryption Standard (AES) is actually a subset of Rijndael.

The Test

I found a nice text file – “The complete works of Shakespeare” – as my test data.

For each algorithm, for each mode, key size and block size, the test program encrypted and decrypted the data twenty times, and reported the average ‘time’ for each operation. I was using the Win32 QueryPerformanceCounter function, which doesn’t really return a time so much as ticks of a high-resolution counter. However, all the tests were done on the same machine, so they’ll do just fine for comparison purposes.
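For the curious, here is a rough sketch of the shape of such a timing loop (not the original test harness – Stopwatch, which wraps QueryPerformanceCounter, stands in for the raw Win32 call, and the file name is illustrative):

using System;
using System.Diagnostics;
using System.IO;
using System.Security.Cryptography;

class EncryptionTimer
{
    // Encrypt the data once with the given algorithm and return the elapsed ticks.
    static long TimeEncrypt(byte[] plaintext, SymmetricAlgorithm alg)
    {
        Stopwatch sw = Stopwatch.StartNew();
        using (MemoryStream ms = new MemoryStream())
        using (CryptoStream cs = new CryptoStream(ms, alg.CreateEncryptor(), CryptoStreamMode.Write))
        {
            cs.Write(plaintext, 0, plaintext.Length);
            cs.FlushFinalBlock();
        }
        sw.Stop();
        return sw.ElapsedTicks;
    }

    static void Main()
    {
        byte[] data = File.ReadAllBytes("shakespeare.txt");   // the test data

        using (RijndaelManaged alg = new RijndaelManaged())
        {
            alg.KeySize = 256;
            alg.BlockSize = 128;
            alg.Mode = CipherMode.CBC;

            long total = 0;
            for (int i = 0; i < 20; i++)    // twenty runs, then average
                total += TimeEncrypt(data, alg);

            Console.WriteLine("Average ticks: " + (total / 20));
        }
    }
}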

Results

With the several factors tested, there are many ways of slicing the data. It’s worth noting that these results are pretty rough, as the times taken also include file IO operations, and with any modern PC there’s also something else happening at any given time. Also, the times are the total time taken to encrypt and decrypt, which might not be the same for each operation. Treat the results as a loose guide.

First, let’s look at the raw results. You can get the results here (Excel file) – EncryptionTimes.

Unsurprisingly, DES is fastest – given its age, and the low level of security it offers now. Triple DES with the longest key it supports was generally slowest. RC2 covered the full range of results, which is also unsurprising given its flexibility, and Rijndael sort of falls in the middle.

The first thing I noticed was how few tests there were using DES or Triple DES. RC2 and Rijndael are much more flexible in their use.

Next, it’s interesting to note that RC2, DES and Triple DES using Cipher Feedback Mode (CFB) were all very, very slow. They all seem to suffer very badly using CFB.

So, excluding those CFB results (as they are so exceptionally slow), what do the other results show? Well, Rijndael does not suffer so badly in CFB, although CFB is still slower for it.

ECB appears slightly faster in the table, though examining the CBC Mode Graph shows little difference.

To compare the modes, I looked at just the operations done with the Rijndael cipher.

Again, we see little difference between ECB and CBC, so I guess there’s no reason not to use the more secure CBC mode over ECB (for an example of its weakness, see here). Also, for 128 bit blocks (as required by the AES standard), CFB is as quick as ECB.

Rijndael is not great in CFB mode with blocks of longer than 128 bits.

Okay, so let’s focus on just one mode (CBC) and look at the results shown in the CBC Mode Graph. It’s interesting to note that RC2 with a 112 bit key was quite quick – faster than with some shorter keys. However, it’s only about 6.5% slower to use 128 bit Rijndael – a key that is 14% longer. Doubling the Rijndael key to 256 bits took only about 10% longer than 128.

Longer blocks take longer to encrypt and decrypt. 64 bit blocks seem a little short these days, only being safe for up to a couple of hundred megabytes. 128 bits seems more reasonable. 256 bits seems excessive. Rijndael seems to have little penalty for using 256 bit blocks over 128, though if you do, you’re not using an AES standard encryption.

Conclusion

DES and Triple DES are old. DES isn’t secure, and Triple DES doesn’t seem to offer much given that Rijndael and RC2 are much faster than it.

In terms of cipher modes, these classes only seem to support ECB, CFB and CBC. ECB is generally regarded as a poor mode – it’s not very secure. CFB was typically slower than CBC, and as Microsoft have already implemented the classes, some of the advantages of CFB (i.e. encryption and decryption being identical operations) have been lost.

So, then examining Rijndael in CBC mode, well, there is little penalty for using 256 bit keys or 256 bit blocks. However, it’s probably worth sticking to 128 bit blocks as 1) it is plenty, and 2) it is AES compatible.

All in all, I was surprised by how similar a lot of the results were for different algorithms, and I was surprised by how slow some of the CFB mode operations were.

To be honest, I can’t really think of a reason not to use Rijndael with 128 bit blocks, in CBC mode. And unless time is a really critical factor, 256 bit keys give extra strength for very little cost. Finally, the RijndaelManaged class in the framework is fully managed code, rather than a wrapper around the unmanaged CryptoAPI (as the *CryptoServiceProvider classes are).

So, the winner is Rijndael!

Comments from my old blog:

This is a very informative article. I have just started looking into encryption, and I have come across nothing on the internet that is as concise as your article.
Will you be doing something similar with asymmetric encryption as well?

By Firoz at 06:22:04 Friday 9th February 2007

Yup, well, at some point. The truth is, in the .NET 2.0 framework there aren’t a lot of other asymmetric algorithms – RSA is about it. I think the .NET 3.0 framework adds elliptic curve, and that would be interesting…

So, yes, when I get around to it.

By Andy B at 10:09:43 Friday 13th April 2007


RSACryptoServiceProvider – "Key not valid for use in specified state"

So, I was trying to do some encrypted comms over TCP, only rather than using SSL, I thought I’d try to RSA encrypt and decrypt at client and server myself. I know, it’s re-inventing the wheel – the point is to get to know the APIs though, and it seemed a good exercise.

I started getting an error though – “Key not valid for use in specified state”. Odd. I was importing the key from an XML file, using the FromXmlString() method. It all seemed to work just fine. So, WTF? It’s not like the code is complicated:

// publickey contains the XML previously exported with rsa.ToXmlString(false)
RSACryptoServiceProvider rsa = new RSACryptoServiceProvider();
rsa.FromXmlString(publickey);
byte[] encryptedData = rsa.Encrypt(data, false);   // false = PKCS#1 v1.5 padding

So what gives?

Well, eventually, I tracked it back to this – I was trying to encrypt too much data. Not very much – less than a couple of hundred bytes – but that was still too much. RSA can only encrypt a single block no bigger than the key size minus the padding overhead, so a 1024 bit key with PKCS#1 padding tops out at 117 bytes.

The obvious thing to do was change the way this works to match the way it’s supposed to work – use RSA encryption to transfer the key for a block cipher, and then encrypt all your data with that block cipher. But I couldn’t be arsed – I just wanted to see the asymmetric encryption work – so I reduced my data…
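For what it’s worth, a rough sketch of that ‘proper’ hybrid approach might look like this (a hypothetical helper, not code from this project):

using System.Security.Cryptography;

static class HybridCrypto
{
    // RSA protects only the small symmetric key; the bulk data goes through
    // the block cipher. The caller needs encryptedKey and iv to decrypt later.
    public static byte[] Encrypt(byte[] data, string rsaPublicKeyXml,
                                 out byte[] encryptedKey, out byte[] iv)
    {
        using (RijndaelManaged aes = new RijndaelManaged())
        {
            iv = aes.IV;   // generated automatically

            // RSA-encrypt just the 32-byte AES key - well within the size limit.
            using (RSACryptoServiceProvider rsa = new RSACryptoServiceProvider())
            {
                rsa.FromXmlString(rsaPublicKeyXml);
                encryptedKey = rsa.Encrypt(aes.Key, false);
            }

            // Encrypt the bulk data with the symmetric cipher.
            using (ICryptoTransform encryptor = aes.CreateEncryptor())
                return encryptor.TransformFinalBlock(data, 0, data.Length);
        }
    }
}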

Comments from my old blog:

Sounds like you were in the 70-536 Self Study book from Microsoft. In chapter 12, doing some of the suggested practices.

Anyway.. that’s where I am and your message here on the blog helped.

I too will send a smaller file 😉

By Micke at 17:13:02 Monday 24th September 2007

Yup, I think I was. It was a bit daft that they didn’t mention the limits on the size of the data.

But that book has a _lot_ of issues.

By Andy at 10:18:03 Thursday 27th September 2007

RSACryptoServiceProvider – "Key not valid for use in specified state"

The speed of collections and For loops in C#

Some of the .NET training I’m doing started me wondering about speeds and things. So, I wrote some tests and turned up some interesting things…

First off, I tried comparing the speed of populating and reading from generic and normal collections. I found that generics are much faster to populate as well as to read from. I’d expected the latter (no type conversion needed), but not a better speed at population – I guess this is because the types can be checked at compile time. I tried this with both a value type (so there might be boxing/unboxing) and a reference type – each time the result was the same: non-generics took ten times as long as generics.

Populating a generic list is twice as fast if it has its capacity assigned. E.g.

List<SomeObj> myList = new List<SomeObj>(10000);

Populating a non-generic list is actually slower if it has its capacity assigned. I have absolutely no idea why.
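For reference, a rough sketch of the sort of comparison involved (illustrative, not the exact test code):

using System;
using System.Collections;
using System.Collections.Generic;
using System.Diagnostics;

class CollectionTimer
{
    const int Count = 1000000;

    static void Main()
    {
        // Non-generic: every int gets boxed on the way in.
        Stopwatch sw = Stopwatch.StartNew();
        ArrayList oldList = new ArrayList();
        for (int i = 0; i < Count; i++) oldList.Add(i);
        sw.Stop();
        Console.WriteLine("ArrayList: " + sw.ElapsedMilliseconds + " ms");

        // Generic, with its capacity assigned up front.
        sw = Stopwatch.StartNew();
        List<int> newList = new List<int>(Count);
        for (int i = 0; i < Count; i++) newList.Add(i);
        sw.Stop();
        Console.WriteLine("List<int>: " + sw.ElapsedMilliseconds + " ms");
    }
}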

FOR Loops are slightly faster than FOREACH loops. However, the difference is piddling, so I’d actually recommend not worrying. Out of preference, I’ll use FOREACH, ‘cos it’s easier to read.

Looking at converting types (well, an integer in most of my tests) I found that:

  • AS is slightly faster than a cast
  • (cast) is much faster than System.Convert

It’s worth noting that if a conversion fails, AS will just return NULL, whereas a cast throws an exception. Raising an exception is slower than testing for null, so AS has a definite speed advantage – and this is why you shouldn’t handle expected failure cases using, um, exceptions. Instead, test for the failure and then deal with it. For example, use something like TryParse. (Actually, I should give that a whirl and see how long it takes.) E.g.

int w = 12;
object o = w;

// Fastest conversion and error handling: 'as' with a nullable value type
// returns null on failure instead of throwing.
int? x = o as int?;
if (x == null) { /* handle the failure */ }

// Okay speed, but very slow error handling when the cast fails
try {
    int y = (int)o;
} catch (InvalidCastException e) { }

// Don't do this
try {
    int y = System.Convert.ToInt32(o);
} catch (InvalidCastException e) { }

I’ll get back to you all about the TryParse thing.
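For reference, the TryParse pattern looks like this (a quick sketch, not yet timed):

// TryParse reports failure via its return value rather than an exception.
string input = "12";
int parsed;
if (int.TryParse(input, out parsed))
{
    // use parsed
}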


Developer Day 4

So, I went to Developer Day 4, and it was very good. I’m now looking forward to WebDD. So, what of the talks at this one…

I went to Ben Lamb’s “How to write crap code in C#”. It was pretty simple, but showed just what you can do to compromise performance. Actually, the biggest message I got from it was that it’s worth testing some of the standard ‘performance tips’ – which was funny as I did that just last week.

The other notable talk was “Securing ASP.NET Websites” by Barry Dorrans. Apart from it being nice to listen to someone with a proper accent, it was a good high-level view of the decisions that you have to make when building a website like that. Some of it was new, some of it was old hat, and it was nice to see the reasoning behind it. He’s a characterful speaker too.

Also, the talk “Securing Web Services using WS-*” by Chris Seary was a good ‘un – finally, I have an answer to the question “Why bother? Why not SSL or IPSec?”. It was nice to have a bit of a higher-level view explained.

In addition, I went to one about “Using and Abusing Reflection” – which seemed a bit too specialised to be of use generally – and making fun of the Irish isn’t a great laugh. Our HR manager would have me warned if I ever did something like that – and quite right too.

Finally, there was the “Technet Highlights” talk, which was great fun, but pretty content free. It did say it wouldn’t be techy. I guess I’d just wanted to hear more of what the buzz was in Barcelona, what things are hot and what’s not (and what the stylish developer will be coding in this season). Still, they were generous with the swag – I’m not sure who they mugged to get all that.

The conclusion – I’ll be going to the next one (unless I’m promoted into management and never touch code again (Not likely))

Comments from my old blog:

Thanks for swelling my ego; I’m glad you enjoyed it and found it useful.

By Barry Dorrans at 21:09:42 Monday 4th December 2006


What the hell are the System.Drawing.Color predefined Colors?

The .NET framework has a number of predefined colours in the System.Drawing.Color class. You’d think these would be easy to iterate over – after all, there are quite a lot of them. I can see that being useful for, say, drawing palettes.

Well it ain’t easy. They’re not an enumeration, so you can’t iterate over them. Instead, to get a list of the colours, you’ve got to do something like:

using System;
using System.Collections.Generic;
using System.Drawing;

List<Color> colorList = new List<Color>();

// KnownColor is an enum, so we can enumerate its values and convert each
// one back into a Color structure.
Array colorsArray = Enum.GetValues(typeof(KnownColor));
KnownColor[] allKnownColors = new KnownColor[colorsArray.Length];
Array.Copy(colorsArray, allKnownColors, colorsArray.Length);

foreach (KnownColor c in allKnownColors)
{
    Color col = Color.FromKnownColor(c);

    // Skip the system colours (window chrome etc.) and Transparent (alpha 0).
    if ((col.IsSystemColor == false) && (col.A > 0))
    {
        colorList.Add(col);
    }
}
That’s a lot of work for something obvious like iterating over colours!
