I'm at the Future Of Web Apps at ExCeL London. The whole world is twittering it. We seem to have just about swamped WiFi, 3G, GPRS and Twitter. Evidently, geeks don't scale.
Rather than join the twitter flood with every comment on the talks, I'll post them here and update this post.
Languages don't scale
Scaling == IO - else ur doin it rong. Should never be CPU-bound.
Therefore language is irrelevant to scaling.
- Don't share: Bottlenecks & SPOF!
- You should be able to *lose* a part of your cluster without your customers noticing.
- Keep your coders happy. They code faster & better.
Flickr hate testing? Flickr are muppets then!
Split the work into smaller chunks (don't try and do it all in the DB?)
Unix load cascades (yeah, we'd spotted that), so you don't get a nice failover warning.
Cache - and use memcache to do it. Digg(?) uses 1TB of memcache.
Cleaning up the cache (keeping it current) is hard work.
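The caching the talk describes is the classic cache-aside pattern: check the cache, and on a miss hit the database and repopulate. A minimal sketch, using a plain dict with TTLs to stand in for memcache (the `fetch_from_db` function is a hypothetical placeholder for the expensive query):

```python
import time

cache = {}  # stand-in for memcache: key -> (value, expires_at)

def fetch_from_db(key):
    # placeholder for an expensive database query
    return f"row-for-{key}"

def cache_get(key):
    entry = cache.get(key)
    if entry and entry[1] > time.time():
        return entry[0]
    return None

def cache_set(key, value, ttl=60):
    cache[key] = (value, time.time() + ttl)

def get(key):
    # cache-aside: try the cache first, fall back to the DB and repopulate
    value = cache_get(key)
    if value is None:
        value = fetch_from_db(key)
        cache_set(key, value)
    return value
```

The "cleaning up" problem is exactly the hard part this sketch skips: invalidating or refreshing these entries when the underlying row changes.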
Can achieve 200-400% scaling by using *smart* caches.
Herd effect - a cache key expires, and all webservers pounce on it at once and try to rebuild it - a type of race condition.
Use "expiration jittering" (cf. TCP fallback?)
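Expiration jittering is simple to sketch: instead of giving every key the same TTL, spread the TTLs randomly around the base value, so keys populated together don't all expire (and get rebuilt by the herd) at the same instant. The constants here are illustrative, not from the talk:

```python
import random

BASE_TTL = 300  # seconds; illustrative base expiry

def jittered_ttl(base=BASE_TTL, jitter=0.1):
    # spread expiry over +/- 10% of the base TTL so keys set
    # at the same moment don't all expire at the same moment
    return int(base * (1 + random.uniform(-jitter, jitter)))
```

Each cache set then uses `jittered_ttl()` instead of a fixed TTL, which decorrelates the rebuilds.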
Use queues! ("starling" at twitter; "gearman"? at digg). All large sites use it!
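The point of the queue is to take slow work off the request path: the web tier enqueues a job and returns immediately, and a worker drains the queue in the background. A toy sketch with Python's stdlib `queue` and a single worker thread (Starling and Gearman are real distributed queues doing the same thing across machines):

```python
import queue
import threading

jobs = queue.Queue()
results = []

def worker():
    # drain the queue; a None sentinel shuts the worker down
    while True:
        job = jobs.get()
        if job is None:
            break
        job()  # the slow work happens here, off the request path

t = threading.Thread(target=worker)
t.start()

# the "web request" just enqueues and moves on
jobs.put(lambda: results.append("thumbnail generated"))

jobs.put(None)  # shut down the worker for this demo
t.join()
```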
Partition data (again, share nothing). Compare horizontal / vertical partitioning (vertical more common).
Again, see MemcacheDB.
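Horizontal partitioning (sharding) can be sketched in a few lines: route each key to a shard chosen by a stable hash, so the same key always lands on the same database. The shard names here are hypothetical:

```python
import zlib

SHARDS = ["db0", "db1", "db2", "db3"]  # hypothetical shard hosts

def shard_for(key):
    # stable hash of the key picks the shard, so lookups and
    # writes for the same key always go to the same database
    return SHARDS[zlib.crc32(key.encode()) % len(SHARDS)]
```

The catch, and why "share nothing" matters, is that queries spanning shards (joins, global counts) now need application-level work.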
Summary: Scaling is in the architecture, not the language.
Ops and Dev can't work independently of each other to scale!
Dopplr: Made of messages
Massive backend integration: asynchronous.
Only do essential work up front; queue the rest.
Read (highly recommended): Enterprise Integration Patterns (Hohpe & Woolf)
Use polling to get progress state from queued systems - share the percentage in shared memcache.
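The progress-sharing idea above is easy to sketch: the background worker writes its completion percentage into a shared store under a per-job key, and the web tier polls that key to render a progress bar. A dict stands in for memcache here; the key format is my own:

```python
progress = {}  # stand-in for a shared memcache

def run_job(job_id, steps=5):
    # the queued worker: after each chunk of work, publish % complete
    for i in range(1, steps + 1):
        # ... do one chunk of the actual work here ...
        progress[f"progress:{job_id}"] = int(100 * i / steps)

def poll(job_id):
    # what the web tier calls on each poll; 0 if nothing published yet
    return progress.get(f"progress:{job_id}", 0)
```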
Result - no single path of control.
Message queueing lends itself to cloud scaling.
We got us a live one here...
The future of mobile is about mobility, about computing anywhere (ubiquitous computing).
The device is secondary; it doesn't have to be one you carry around with you, it needs to be something you can use where you are.
Presence and location; devices can adapt to where you are. States and events are a part of presence.
See: tonchidot video - crowdsourcing.
Interoperability and open standards are essential.
It's got to be easy - internet TV needs to reach the same devices & quality as other formats.
Needs to be compelling and interactive.
(This is more a "how to sell your digital TV channel" thing than a tech one, and I'm also trying to fix some very old blog sync code that this is running on at the same time, hence the reduction in detail!)
Blowing up the social web
All sites seem to ask argh-too-many questions on signup. This is a barrier to entry.
"Invite your friends! Give us your email password!" Really not a good plan. Need APIs and open standards.
Problem is of "finding people you know and sharing with them".
It's the interoperability, st00pid!
Too many services / social network is not scalable!
We need "distributed social networks": Very true, but it's not a trivial problem.
Side thought - not many small and unconvincing startup stalls here this year.
We now have "the open stack" (last year's FOWA was definitely looking at early experiments; it's starting to come together more now).