I was able to attend DDD South West (www.dddsouthwest.com) in Bristol over the weekend. The conference is a free one day event open to all, but mainly focused on the .NET platform. It’s a great way to hear excellent speakers, whom you would not normally get to see, talk about areas they are passionate about.

Sessions that I attended included:
- .NET Collections Deep Dive by Gary Short (http://garyshortblog.wordpress.com/)
- No More Passwords by Jimmy Skowronski (http://jimmylarkin.net)
- Performance and Scalability, the Stack Exchange Way by Marc Gravell (http://marcgravell.blogspot.co.uk/)

———————————————————————————————-
.NET Collections Deep Dive
A session convincing us that there is more to .NET collections than just lists.

Lists:
- AddRange is better than Add inside a loop as it cuts down on the number of Array.Copy calls needed when adding new items and expanding the underlying array (this really comes into effect when adding more than 10,000 elements to a list). Alternatively, you can set the capacity of the list when constructing it to reduce this further.
- Using a collection initializer to specify the elements of the list, e.g. new List<int> { item1, item2 }, is the same as calling Add multiple times.
- List.Remove performs an IndexOf before calling List.RemoveAt, so if you know where in the list your item is, just call RemoveAt directly.
- List.Sort uses QuickSort as its underlying sort algorithm. This is the fastest sort algorithm for general purpose sorting, O(n log n) in the best and average cases, but O(n²) in the worst case. What’s the worst case? An already sorted list (with a naive pivot choice), so it can be better to shuffle the list before calling Sort to avoid the problem.
- List disadvantages: Insert and Remove are O(n) operations, and Add is O(n) whenever the backing array has to grow (amortised O(1) otherwise).
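The list points above can be sketched as follows (the element values are illustrative):

```csharp
using System;
using System.Collections.Generic;

static class ListDemo
{
    public static string Run()
    {
        // Pre-sizing means the backing array never has to grow, and
        // AddRange copies all the new items in a single operation.
        var source = new[] { 1, 2, 3, 4, 5 };
        var list = new List<int>(source.Length);
        list.AddRange(source);

        // If the index is already known, RemoveAt skips the IndexOf
        // scan that Remove performs internally.
        list.RemoveAt(0); // removes the 1 directly
        list.Remove(5);   // scans for 5 first, then removes it

        return string.Join(",", list);
    }

    static void Main() => Console.WriteLine(Run()); // 2,3,4
}
```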

Linked List
- Doubly linked, so each element holds references to the elements before and after it
- Add, Insert and Remove are all O(1) operations (given a node reference); lookups, however, are O(n)
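A minimal sketch of both complexities, assuming you keep hold of a node reference (the values are illustrative):

```csharp
using System;
using System.Collections.Generic;

static class LinkedListDemo
{
    public static string Run()
    {
        var list = new LinkedList<string>();
        var middle = list.AddLast("b"); // keep the node reference
        list.AddFirst("a");
        list.AddLast("c");

        // O(1): we already hold the node, so no traversal is needed.
        list.AddAfter(middle, "b2");

        // O(n): Find has to walk the chain from the head.
        var found = list.Find("c");

        return string.Join(",", list) + ";" + found.Value;
    }

    static void Main() => Console.WriteLine(Run()); // a,b,b2,c;c
}
```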

Dictionary
- Performance depends on the result of Key.GetHashCode. If you are using your own objects as keys you need to override this method yourself. You also need to override Equals so that keys whose hash codes collide can still be told apart, as every key in a dictionary must be unique.
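A sketch of a custom key type; GridPoint and its hash formula are illustrative, not from the talk:

```csharp
using System;
using System.Collections.Generic;

// Illustrative custom key: GetHashCode decides the bucket and Equals
// distinguishes keys whose hash codes collide. The 397 multiplier is
// just a common prime used to mix the fields.
class GridPoint
{
    public int X, Y;
    public override int GetHashCode() => X * 397 ^ Y;
    public override bool Equals(object obj) =>
        obj is GridPoint p && p.X == X && p.Y == Y;
}

static class DictionaryDemo
{
    public static string Run()
    {
        var map = new Dictionary<GridPoint, string>();
        map[new GridPoint { X = 1, Y = 2 }] = "first";
        map[new GridPoint { X = 1, Y = 2 }] = "second"; // same key: overwrites

        return map.Count + ":" + map[new GridPoint { X = 1, Y = 2 }];
    }

    static void Main() => Console.WriteLine(Run()); // 1:second
}
```

Without the overrides, the two GridPoint instances would hash and compare by reference and the dictionary would treat them as different keys.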

Lookup
- Keys do not have to be unique
- Can be created using lists of tuples and converted to lookups
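A minimal sketch of building a lookup from tuples (the keys and values are illustrative):

```csharp
using System;
using System.Linq;

static class LookupDemo
{
    public static string Run()
    {
        // Duplicate keys are fine: each key maps to a sequence of values.
        var pairs = new[]
        {
            Tuple.Create("fruit", "apple"),
            Tuple.Create("fruit", "pear"),
            Tuple.Create("veg",   "leek"),
        };

        ILookup<string, string> lookup =
            pairs.ToLookup(p => p.Item1, p => p.Item2);

        // A missing key yields an empty sequence rather than an exception.
        return string.Join(",", lookup["fruit"]) + ";" +
               lookup["missing"].Count();
    }

    static void Main() => Console.WriteLine(Run()); // apple,pear;0
}
```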

Collection Version Numbers
Every regular collection has a version number; this is what prevents you from modifying a collection while iterating over it. This traditionally causes problems in multi-threaded environments, where one thread may update a collection that another is enumerating. Concurrent collections were created to solve this.
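The version check can be seen by deliberately mutating a list mid-enumeration (a minimal, intentionally broken sketch):

```csharp
using System;
using System.Collections.Generic;

static class VersionDemo
{
    public static string Run()
    {
        var list = new List<int> { 1, 2, 3 };
        try
        {
            foreach (var n in list)
                list.Remove(n); // bumps the version; the enumerator notices
        }
        catch (InvalidOperationException)
        {
            return "caught"; // "Collection was modified" during enumeration
        }
        return "not caught";
    }

    static void Main() => Console.WriteLine(Run()); // caught
}
```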

Concurrent Collections
- New collections in .NET 4.0
- Implement IProducerConsumerCollection<T>, which provides the methods TryAdd and TryTake
- TryAdd: attempts to add an item to the collection and returns a boolean result
- TryTake: attempts to remove an item from the collection, returning a boolean result and the item via an out parameter
- Blocking collections: use Add and Take; TryAdd and TryTake are also available with timeouts
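A minimal sketch of both APIs (the values are illustrative):

```csharp
using System;
using System.Collections.Concurrent;

static class ConcurrentDemo
{
    public static string Run()
    {
        // ConcurrentQueue implements IProducerConsumerCollection<T>.
        IProducerConsumerCollection<int> queue = new ConcurrentQueue<int>();
        queue.TryAdd(42);

        int item;
        queue.TryTake(out item); // true when something was removed

        // BlockingCollection wraps a producer/consumer collection and
        // adds blocking Add/Take plus TryTake with a timeout.
        using (var block = new BlockingCollection<int>())
        {
            block.Add(7);
            int taken;
            bool ok = block.TryTake(out taken, TimeSpan.FromMilliseconds(100));
            return item + "|" + ok + "|" + taken;
        }
    }

    static void Main() => Console.WriteLine(Run()); // 42|True|7
}
```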

Summary
- List: good general purpose collection. Construct to size if possible, prefer AddRange to Add. Be aware of issues with QuickSort (search for “QuickSort killers” for scenarios)
- LinkedList: fast insert/remove
- Dictionary: fast lookup
- Lookup: multi key values
- Concurrent Collections: thread safety

————————————————————————————————-

No More Passwords – or how to use OAuth

Most public websites that want you to interact with them seem to have some sort of account creation area. While it’s probably fine to have separate accounts on Google and Amazon, it gets really tiresome when you are trying to download a software demo or take part in a forum, especially when you consider the number of passwords you have to remember and the number of media stories about hackers getting hold of account details from insecure sites.

Fortunately OAuth presents an opportunity to hold an account with a provider, e.g. OpenID, Facebook, Twitter, Google or LiveID, and then use those credentials across multiple websites, bringing your existing account details with you.

To incorporate this into your own sites, .NET provides a series of libraries under WebMatrix.Security that allow you to add providers with only a few lines of code. LiveID, Facebook, Twitter and Yahoo OpenID all come out of the box, and Google can be easily added.

The steps involved are to register your site with the individual provider, who will give you a token to add to your global.asax; the call on your site to perform the login is then as simple as OAuthWebSecurity.RequestAuthentication(provider, postbackUrl).

One of the items of data that comes back with the request is a unique id representing the user, which can then be stored in the application database, allowing settings and data to be persisted. For our public-facing customers there should no longer be a need to roll our own account management system.
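A sketch of those steps, assuming the OAuthWebSecurity helpers that ship with ASP.NET MVC 4; the app id, secret and URLs are placeholders:

```csharp
// Sketch only: assumes the Microsoft.Web.WebPages.OAuth helpers from
// ASP.NET MVC 4 / WebMatrix. The id and secret come from registering
// your site with the provider and are placeholders here.
using Microsoft.Web.WebPages.OAuth;

public static class AuthConfig
{
    public static void RegisterAuth()
    {
        OAuthWebSecurity.RegisterFacebookClient(
            appId: "your-app-id",
            appSecret: "your-app-secret");
    }
}

// In the login action, redirect out to the provider...
//   OAuthWebSecurity.RequestAuthentication(provider, postbackUrl);
// ...then complete the round trip at the postback URL:
//   var result = OAuthWebSecurity.VerifyAuthentication(postbackUrl);
//   // result.ProviderUserId is the unique id to store per user.
```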

Jimmy has provided a demo of how this can be achieved, and as his presentation was itself a website you can get the full notes at http://jimmylarkin.net/post/2012/05/27/DDD-South-West-OAuth-Session.aspx

————————————————————————————————-
Performance and Scalability, the Stack Exchange Way

Marc started out by explaining that the traditional answer to performance issues (throw another server at it) doesn’t work for websites the vast majority of the time, as highlighted by StackOverflow’s own experience: they serve up seven million page views a month with their servers running at only 10% CPU capacity.

He also noted that IIS profiling doesn’t always show where the issues lie; for that you need something reporting from within the code. This is how MiniProfiler was born: an open-source profiler for MVC and ASP.NET websites that allows you to tag regions of code and report on the timings within them. It can even go down to the database level and report on the SQL that was actually executed, which is handy for Entity Framework and LINQ statements.
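A sketch of tagging a region, assuming the MiniProfiler package for ASP.NET MVC; the controller, action and step name are illustrative:

```csharp
// Sketch only: assumes the MiniProfiler NuGet package; the controller,
// action and step name are illustrative.
using System.Web.Mvc;
using StackExchange.Profiling;

public class HomeController : Controller
{
    public ActionResult Index()
    {
        var profiler = MiniProfiler.Current; // null when profiling is disabled
        using (profiler.Step("Load front page"))
        {
            // timed region: SQL issued in here can also be captured
        }
        return View();
    }
}
```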

It’s also lightweight, so unlike other profilers it can be left running in a production environment and only enabled when needed. The website itself shows its usage, so check out http://miniprofiler.com/ for more details.

The second tool that Marc showed was Dapper (http://code.google.com/p/dapper-dot-net/), a lightweight object mapping tool that is very effective for read-heavy sites. It was created when the StackOverflow team noticed that some Entity Framework calls took 400ms to run instead of the usual 4ms; the issue turned out to be the mapping process between SQL results and custom objects within code.
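A sketch of a Dapper read, assuming the Dapper package; the Post class, table and connection string are illustrative:

```csharp
// Sketch only: assumes the Dapper package and a Posts table matching
// the illustrative Post class; the connection string is a placeholder.
using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper;

public class Post
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public static class PostStore
{
    public static IEnumerable<Post> LoadPosts(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            // Query<T> maps each row straight onto a Post, with no
            // change tracking to slow the read down.
            return conn.Query<Post>(
                "select Id, Title from Posts where Id > @MinId",
                new { MinId = 0 });
        }
    }
}
```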

I believe that both of these tools would benefit new and existing projects alike and should be investigated further.

————————————————————————————————-

Presenters at DDD events are normally very good at making their presentation content available on their blogs so it would be worth checking out the agenda at http://www.dddsouthwest.com/Agenda/tabid/55/Default.aspx and looking up any of the speakers whose sessions catch your interest.

If any of this has captured your interest in attending DDD events, the next one that I know of is DDD Reading on Saturday 1st September, though DDD North is about to announce its call for speakers. Keep an eye on the official DDD site at http://developerdeveloperdeveloper.com for more details.
