First of all, if you’re having trouble viewing my World Cup site, sorry. I wanted to show what was possible using the best of emerging technologies with Kwwika, so it probably looks very broken on anything but the latest generation of desktop and mobile browsers. It’ll work just fine in IE8, Firefox 3.6 & maybe Opera 10, though I still have to test that one. But if you want to see it at its best use Safari 5 or Chrome.
When you first arrive the site attempts to use Geolocation to find out where in the world you are (with a fallback to GeoIP). It tries to match your location to a World Cup team and begins streaming live Tweets about that team using the Kwwika API. Sorry Scottish people, but as it reports you as being in the UK you get England tweets.
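The lookup chain can be sketched roughly like this. Note this is an illustration, not the site’s actual code: `geo` stands in for `navigator.geolocation` and `geoIpLookup` for whatever GeoIP service is used, both passed in so the flow is easy to test.

```javascript
// Rough sketch of a Geolocation-with-GeoIP-fallback lookup.
function locateVisitor(geo, geoIpLookup, onFound) {
  if (geo && geo.getCurrentPosition) {
    geo.getCurrentPosition(
      function (pos) { onFound(pos.coords.latitude, pos.coords.longitude); },
      function () { geoIpLookup(onFound); } // user denied, or it timed out
    );
  } else {
    geoIpLookup(onFound); // no W3C Geolocation support at all
  }
}
```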
We also populate the ‘Now Playing’ section with the country name & flag and show the team’s percentage chance of winning. This is updated every day from the figures released by the Norwegian Computing Center (Norsk Regnesentral) and 2|Sporten.
As the Tweets come in they are added to a first-in-first-out queue so you’re not drowned in them, and they are displayed at a brisk but readable pace. We then take the Twitter ID of the user who made the tweet and add it to the roster of players. Their avatar will then be shown in the ‘Team’ section, and it links through to their Twitter profile page.
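A minimal version of that display queue might look like this (names and the drain interval are illustrative, not taken from the site):

```javascript
// Incoming tweets are buffered and drained at a fixed, readable pace
// rather than rendered the instant they arrive.
function TweetQueue(display) {
  this.pending = [];   // first-in-first-out buffer
  this.display = display;
}
TweetQueue.prototype.push = function (tweet) {
  this.pending.push(tweet);
};
TweetQueue.prototype.tick = function () {
  // called on a timer, e.g. setInterval(function () { q.tick(); }, 1500)
  if (this.pending.length) {
    this.display(this.pending.shift()); // oldest tweet first
  }
};
```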
If you’ve enabled the Map Display we also get the user’s Geolocation (when available) from the Kwwika API and use it to plot their location when we show their tweet. However, I noticed during testing that not an awful lot of people geocode their tweets, which is a shame :( When you first enable the map it’ll attempt to start where we think you, the visitor, are, thanks to the Geolocation or GeoIP lookup we did right at the start.
While all this is happening we have a second and a third subscription firing up and getting fed push data by Kwwika, this time for match information from Opta, which you can see at the top of the screen. This should show you the upcoming matches in the competition.
We then take the visitor’s country and try to find some more data from different sources. We start regularly polling the Twitter search API looking for World Cup tweets from the selected country which also contain Twitpic links. We then use the Twitpic API to display thumbnails of any found images under the ‘Highlights’ section. Clicking any of these thumbnails will take you to the large image on the Twitpic site.
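Extracting the Twitpic links from the tweet text is a simple regex job. A sketch of the idea, with the caveat that the `/show/thumb/` URL pattern is from memory of the Twitpic site of the time, so treat it as an assumption:

```javascript
// Pull Twitpic photo IDs out of tweet text and build thumbnail URLs.
function extractTwitpicIds(text) {
  var ids = [];
  var re = /twitpic\.com\/(\w+)/g;
  var m;
  while ((m = re.exec(text)) !== null) {
    ids.push(m[1]); // the short ID after the domain
  }
  return ids;
}
function twitpicThumbUrl(id) {
  return "http://twitpic.com/show/thumb/" + id;
}
```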
(I would like to have Kwwika supply image links found in Twitter as well, but I didn’t want to give the game away by requesting it!)
The instant replay is obviously a YouTube video. This is found using Google’s YouTube search API and should hopefully be relevant to the selected country as well! (There is a second YouTube video hidden on the site for those who want to find it.)
Up to this point everything has been done client-side. Everything. Tweets, image and video searches. Fixtures, maps - the lot. You could merge the whole bleeding thing into a single HTML file and run it from a thumbdrive!!
Our first piece of server-side code is the ‘Commentary’. A small PHP script fetches (and caches) RSS feeds from blogs and newspapers such as FIFA, the BBC, the Guardian etc. These RSS feeds are then turned into JSON and fed to the client-side code, which displays a different random item every dozen seconds or so. It is also the only section which isn’t filtered when you pick a new country.
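The client-side rotation is just a timer and a random pick that avoids showing the same item twice in a row. A sketch (assuming the feed items are distinct; not the site’s actual code):

```javascript
// Pick a random feed item, never repeating the one currently shown.
function pickNextItem(items, current) {
  if (items.length < 2) return items[0] || null;
  var next = current;
  while (next === current) {
    next = items[Math.floor(Math.random() * items.length)];
  }
  return next;
}
// usage: setInterval(function () { shown = pickNextItem(feed, shown); render(shown); }, 12000);
```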
You have some controls on the page. The main ones being the flags at the top which let you change which country’s tweets are being shown, and the language dropdown.
Anyone who has worked with Twitter data will tell you that you can’t trust the language people say their stream is in. They either set it wrong or don’t bother to change it from the default. So to get around this I detect the language of each Tweet in the client using Google’s language API. Turn it on and only Tweets recognised as being in the language you pick will be shown. It has no effect on Tweets already shown, or on the image/video searches.
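The filtering itself is straightforward once you have a detector. In this sketch `detect` is a stand-in for the Google language detection call (which is asynchronous in the real API; it’s synchronous here to keep the example short):

```javascript
// Keep only tweets whose detected language matches the one picked
// in the dropdown.
function filterByLanguage(tweets, detect, wanted) {
  var keep = [];
  for (var i = 0; i < tweets.length; i++) {
    if (detect(tweets[i].text) === wanted) {
      keep.push(tweets[i]);
    }
  }
  return keep;
}
```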
The ‘Map Display’ I mentioned briefly. This one comes with a warning: should there be a sudden upsurge in the number of people geocoding their tweets, it would have to draw a heck of a lot of markers, which can be CPU intensive. Chances are very, very slim. But better safe than sorry.
I have to say a HUGE thank you to Kwwika (and the sponsors Tellylinks) for the API and also for agreeing to me releasing my code under the LGPL. Which I will do after the competition has closed! And don’t forget to click the share button to let your followers know about it as well!
Data sources used in this mashup
Kwwika - live Tweets, user’s geodata and fixtures (via Opta)
Yahoo - YQL Geo library
Google - Maps, Language and YouTube APIs
Norsk Regnesentral & 2|Sporten - team percentage chances of winning
BBC, FIFA, Guardian, World Cup News & World Cup Buzz - RSS news feeds
Don’t forget you will be able to grab the source from GitHub (I’ve created this empty repo for it) so you might want to go apply for that Kwwika Beta access now (you can develop on localhost even if you don’t get an invite though!).
And please, please, please remember…the share button! Oh, and geocode your bleedin World Cup Tweets people, the map works so much better with them! ;)
As you can tell from my very accurate measurements I’m a very precise cook.
2 big bags of Maltesers
1 small packet of digestive biscuits (250g pack)
Mash up most of the pack of biscuits (leave 4 or 5 in the pack) and 1 of the bags of Maltesers into crumbs. I normally put them into a sturdy ziplock bag and hit them with a saucepan.
Put a couple of large knobs of butter (70-100g) into a hot pan and melt, being careful not to burn it. Add 3 big teaspoonfuls of golden syrup and the same of the cocoa powder and give it a good stir.
Take it off the heat and chuck in your biscuit and Malteser crumbs. Mash it all together and check the consistency. It should be a bit sticky and not dry. If it is too dry you didn’t melt enough butter. (Quickly scrabble and find another clean pan to melt some more butter to add to your mix).
Once you have it all mashed together mix in a couple of handfuls of whole Maltesers from the second bag and stir together again. While doing this shout for your other half to line a shallow loaf pan with cling-film as you forgot to and now your hands are covered in gunk. (The cling-film makes it easy to get the buns out of the pan after they’ve set.)
Spread the mixture in the pan, it should be about an inch thick. Press some more whole Maltesers into the top. Cover the top with some more cling-film and put in the fridge to set. Should be ready to eat in a few hours. Just prod it every now and then until it has gone hard.
I’ve just bought my first @grabaperch license. I haven’t installed it yet but I’ve already run into a minor issue. Much like Pictos I feel like I am being forced into bad practice by restrictions put upon me by Perch’s licensing.
Each license is for a single site, but can be associated with 2 domains: live and testing. But really you should have a minimum of 3 domains:
Development. Where you do your actual development. Each developer working on the project should have their own local development environment where they can build without stepping on each other’s toes.
Staging. When a block of work is completed it is then placed on the staging server. This is where the client can view the work and sign off on it. The only code that makes it onto staging should be code ready for production but not yet approved.
Live. Our live site. Public facing and approved by the client.
These are only the basic 3; there are many more ways of working which would require even more domains. But it would still only be 1 site. I know their terms & conditions outline all of this, but I would like to suggest a small adjustment: do not count TLDs. That way you could have:
example.local [local development]
example.branch [An offshoot of main development]
Solves all the problems. The software is still protected; without a hosts file entry or your own DNS server the custom TLDs are not going to resolve. OK, so someone could run .co.uk, .com, .es, .nl sites from the same license, but at some point you have to think of your fee-paying customers, even if it might make things a little bit easier for the dishonest ones.
However, I do have to applaud them for not being as draconian as certain video games, which assume everyone is a thief if they can’t connect to DRM servers.
If activation can’t complete, assume honesty. That’s how I roll.
So what about it @grabaperch, do you think it might be an adjustment you are willing to make?
I’ve added this suggestion to Perch’s official suggestion forum. You can view it here
I'm sorry, but my quota of shit has been filled so I won't be accepting any more.
Your constant survey ‘courtesy reminders’ are a bit much. Do you know you’re averaging an email a day at the moment?
It used to be ok when I would get 1 a week or less. But every sodding day? I’ve got better things to do thanks.
So for that reason please remove me from any panels, groups, committees, mailing lists, sewing circles, or whatever nonsense I am getting emails from, and never contact me again.
Whenever someone is nice enough to agree to your occasional emails, make sure they stay fucking occasional or they might stop being nice.
I think I might actually have to follow through on this tweet.
For some reason the larger techie get-togethers in Glasgow (& Edinburgh) seem to have been taken over by sales/marketing type people pushing their products or yelling about how
'Every lean RIA needs to value add its microblogging and recession proof its success metrics by employing the Obama effect to create a paradigm shift in its ROAS and maximise the value proposition of the long-tail buyersphere dollarization.'
The developer community needs to step up and reclaim these tech events. You have all probably read my thoughts on the “Barcamp” event in Glasgow. I detest referring to that event as a Barcamp as it was not, in my opinion, a Barcamp.
BarCamp is an international network of user-generated conferences (or unconferences). They are open, participatory workshop-events, whose content is provided by participants. The first BarCamps focused on early-stage web applications, and were related to open source technologies, social protocols, and open data formats.
The procedural framework consists of sessions proposed and scheduled each day by attendees, mostly on-site, typically using white boards or paper taped to the wall.
While loosely structured, there are rules at BarCamp. All attendees are encouraged to present or facilitate a session.
All the above is from the Wikipedia definition of a Barcamp. Does any of that sound like what went on at the Glasgow event? Fuck no. Just look at these guys. Do you see a single suit? That is the first Barcamp, and that single image sums up what is great about Barcamp and also shows exactly why the event in Glasgow is not worthy of the name.
That is why I want to reclaim Barcamp for the techies. For every person who loves code, adores open source, salivates over new technologies. Who can talk for hours about web standards and has heated debates about UX, but who doesn’t give a flying fuck about the marketing and sales bullshit currently being fed to us under the guise of being tech.
So, time for me to 'put up or shut up'. To this end I will be organising an unconf/Barcamp. An event by a techie for other techies. There is obviously a lot of work involved in this, probably more than I realise. But I am determined to stop just bitching and do something!
In the coming days I will be launching the event site and collecting registrations from those who might be interested in attending, so please keep an eye on this blog or my twitter feed for more updates.
Just checked my search stats. According to Google I’ve done 31,407 searches since July 5th 2006. That’s on average ~22 searches a day. Almost 1 an hour, every hour. For nearly 4 years.
There doesn’t seem to be a massive variation in what time of the year I do most of my searching. A fall-off in December is most likely due to me being off work (I normally take 2 or 3 weeks to go back to Ireland to visit family) and February, well it’s a short month.
No surprise I search more during the week than at weekends. Or that most of my searching is done during business hours.
I’m not going to work it out exactly, but let’s take it that 60% of my searching is done between 9am and 5pm, Mon-Fri. So out of those 4 years we lose about 416 days to weekends.
(60% of 31407)/(1451 - 416) = 18.207 searches per day.
But our ‘day’ is only 8 hours long now, well 7 if we take out an hour for lunch. So I do 2.601 searches per hour every work day. Or a search every 23 minutes.
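The same sums, spelled out (the 60% split and the 7-hour day are the assumptions stated above):

```javascript
// Re-running the back-of-the-envelope numbers from above
var totalSearches = 31407;
var totalDays = 1451;      // July 5th 2006 to roughly now
var weekendDays = 416;     // days lost to weekends
var workShare = 0.6;       // assumed share done 9am-5pm, Mon-Fri
var perWorkDay = (workShare * totalSearches) / (totalDays - weekendDays); // ~18.2
var perHour = perWorkDay / 7;        // 7-hour day once lunch is taken out
var minutesBetween = 60 / perHour;   // ~23 minutes between searches
```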
Whoever did these designs is as much a 'Designer' as I am a 'World class sprinter'
I might know how to run, I might even be able to beat someone who is really fucking bad at it. But put me up against someone with the slightest bit of talent and all I’m going to do is embarrass myself, and possibly throw up at the end.
An email subject should concisely describe the content of the email. It should be short, precise and easily scannable, but give a good summary of what the email is about.
This way in 6 months time when I need to find a particular email I can easily scan the email subjects and get a good idea of what they contain. If your subjects are all "Question", "FYI", "Quick request", etc and I have to wade through hundreds of previous emails all like that I am going to fantasize about gutting you with a rusty spoon :)
@aliCM is the master of the short subject.
Thankfully her emails are more conversations and I never have to refer to them again in the future. Otherwise I think I would cry.
Twitter have extended the OAuth deadline until 8 weeks from now as everyone is too busy watching the World Cup to bother changing their code.
Actually it is more likely because Twitter is in melt-down due to the volume of traffic it is receiving with the World Cup being on, and developers can’t test their new code! Oh well, it gives @LuvOrHate a few more weeks to live.
I’ve been a fan of Less for a while now. As a developer it is how I think CSS should have always been. Functions (mixins), Extending objects (nested rules), Operations and Variables.
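For anyone who hasn’t seen it, here are those features in one snippet. This is purely illustrative, not from a real stylesheet:

```less
@brand: #d01c1c;                 // variable
.rounded(@radius: 4px) {         // mixin with a default argument
  border-radius: @radius;
}
.panel {
  background: @brand;
  width: 960px / 3;              // operation
  .rounded(8px);                 // mixin call
  h2 { color: lighten(@brand, 20%); }  // nested rule
}
```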
There are already some great tools for working with Less. The obvious one is the Less Ruby gem, which gives you a command-line compiler for turning .less files into plain CSS.
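From memory the install and compile steps looked roughly like this (gem-era details, so double-check against current docs):

```shell
# install the Ruby gem, then compile a .less file to plain CSS
gem install less
lessc styles.less   # writes styles.css alongside
```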
The watch feature of Less.js in the browser is very cool: append #!watch to your page URL (or call less.watch() from the console) and it auto-refreshes your styles whenever your Less file changes.
Installation of Less.js in the browser is really rather easy as well. Note however the rel attribute on the stylesheet link tag.
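The setup is just two tags (the file paths here are placeholders):

```html
<!-- the .less file goes in a link tag with a special rel value -->
<link rel="stylesheet/less" href="styles.less">
<!-- less.js must come *after* any stylesheet/less links -->
<script src="less.js"></script>
```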
If you use a deployment manager such as Phing this could even be an automated part of your build/deployment process.
“edward cullen pulled up alongside me in his shiny, silver volvo, which was silver and a saab, i think. his well-muscled chest was riding shotgun, wearing a blue-gray waffle knit long-sleeved t-shirt, relaxed fit jeans with contrast stitching in a lightly distressed wash, and an ivory-colored jacket made from the dyed skins of clubbed baby seals. he dressed very well, like someone who wears nice clothes.”—a Twilight review
or #barcampglasgow. I still don’t know what the right hash tag is/was.
To be honest it was a bit of a let-down. I don’t think they quite understood what a barcamp is. The talks were too formal, and felt like sales pitches. With fuck all technical content.
Whitespace - Integrating Tennent’s : A promising start. I enjoyed the Whitespacer’s talk and they have done a brilliant job with the “Tennent’s Hub”. They gave an interesting overview of their project planning/management. But I would have liked to have seen something more in-depth. I know time is limited, but maybe they should have picked one or two key areas and explored those, rather than trying to cover the entire campaign. It came across like a sales pitch at times, which is unfortunate, but understandable as they should be proud of what they have done.
Dog Digital - Is the internet making me stupid? : This talk was like an ADD kid fed a triple espresso. There was no cohesion and they seemed to jump from one topic to another with barely the flimsiest of connections. It was like they’d run out of things to say about the main topic after 5 minutes so just got a developer to talk about geolocation. To be honest I am glad they did, as the geolocation bits were their only saving grace. But it too lacked anything techie and didn’t introduce anything new. Quite disappointed to be honest, had hoped for better.
Innovation Digital - It’s not how big it is : Gets the “No shit Sherlock” award. Apparently if websites aren’t usable, people don’t like to use them. Well I don’t know about you, but that blew my mind. A 5 minute skim of any Steve Krug book would be more enlightening.
Scotsman.com - Newspapers in a digital world : Should have been named “Scotsman.com, annual share-holders meeting”. 30 mins of performance figures about Scotsman.com and their satellite projects. Snore. The only mention of any sort of technology was to tell us the revenue it was generating, or its expected growth. As a developer I was enthralled.
Realtime Worlds - Keynote : Hey, you’re not Spotify!? Realtime Worlds did an ok keynote. Again a complete sales pitch, but at least they had a cool video of their game.
The real barcamp.
I skipped the football and headed upstairs for the rapid-fire Unplugged sessions. How fucking glad am I that I did. Finally something resembling a barcamp! Woohoo. Informal sessions with a heavy leaning towards tech. This is what the whole thing should have been. And I cannot commend those speakers enough.
They did more with 5 minute slots than any of the speakers above did in 30. They engaged their audiences, introduced some interesting tech and salvaged the day for me. I’ll list the websites of those who impressed me the most, go check them out, they’re awesome.
Oh and a special mention to David Gallagher from the BBC for having the prettiest slides. It was nice to see someone put some effort in (take note, Dog). Pity he was a designer and I had some uber-techy questions about the iPlayer. But I still really enjoyed his talk, and even though I’m a dev rather than a designer I took a lot away from how he ran their user review process.
A client copies & pastes from MS Word into your CMS and suddenly your lovingly hand-crafted HTML is full of stinking, invalid, word generated, formatting code. Eww.
Until now the only way to avoid the problem was to paste the text into a plain text editor (like notepad.exe), copy it from there, and then paste it again, this time into your CMS. A bit crappy.
With Suds just copy the text from Word, click the suds icon and bam, nice clean text ready to be pasted into your CMS. Awesome.
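For the curious, this is the kind of clean-up such a tool has to do: strip Word’s proprietary tags, classes and inline styles from the pasted markup. A sketch of the general technique only — this is not Suds’s actual code:

```javascript
// Strip the most common MS Word cruft from pasted HTML.
function cleanWordHtml(html) {
  return html
    .replace(/<\/?o:p[^>]*>/gi, "")             // Word's <o:p> tags
    .replace(/<\/?(xml|meta|link)[^>]*>/gi, "") // header cruft
    .replace(/\s*mso-[^:]+:[^;"']+;?/gi, "")    // mso-* style rules
    .replace(/\s*class="Mso[^"]*"/gi, "")       // MsoNormal and friends
    .replace(/\s*style=""/gi, "");              // now-empty style attributes
}
```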
Trying to decide between the best tool for the job and something a bit easier for non-techies. I’ve had an idea for a little project (yes, another one) but the software I want to use, Sphinx, is not common on shared hosts. I’ll be releasing it as an open source project, but the lack of shared host support will limit the prospective user-base.
Only those with a minimum of a VPS and enough know-how to install and configure Sphinx will be able to run it. Not exactly the casual user market. But then again I don’t even know if the project will appeal to casual users, or anyone, so it has to be fun to build for me, and I kinda wanna learn how to use Sphinx. Plus any time I’ve had to roll my own search, it’s sucked. This should help me on my work projects from now on as well.
I’ve still got 2 personal projects to get out the door first so I have time to decide…
I think the problem is I use email mostly for work, and people don’t tend to sign up to social networks with their professional email account, so it has a hard time finding them.
Also it doesn’t recognise the fact I have multiple email accounts. I can send email as half a dozen different email addresses from a single Gmail account, yet I can only manage the Rapportive account for my primary address. There should be some way to link these accounts, or preferably edit the information for each in a central location. Or maybe there is and I’m just missing it?
These are just a few of the things I currently have on my list to learn. Some I will have already started little projects in as I always learn best by doing, others are still on my todo someday list.
This list isn’t static either. It changes constantly. For example 6 months ago Ruby would have been on there, but recently I decided I’d rather learn Python instead. That doesn’t mean I might not change my mind and add it back onto the list in another 6 months. As I said it changes constantly.
I am constantly learning. There is always some new technology or framework I want to get to grips with. I do this for pleasure! I enjoy it and spend my own free-time doing so. Yet I would constantly skip school, or sleep during class. What a difference being interested in your subject makes.
Extending my dbSQLite class so that it functions as a Key:Value store. Perfect for storing user preferences, although I was a bit peeved to discover that SQLite doesn’t support ON DUPLICATE KEY UPDATE, hence the get within the set method.
For volatile storage just set this.db to null (line 6)
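For anyone wanting the same thing, the shape of it is roughly this. The `execute` function is a stand-in for the real dbSQLite query method (an assumption, not its actual signature), and it’s worth noting SQLite’s own INSERT OR REPLACE can collapse the get-then-set into one statement:

```javascript
// Sketch of a Key:Value layer over a SQLite-style store.
function KeyValueStore(execute) {
  this.execute = execute;
  this.execute("CREATE TABLE IF NOT EXISTS prefs (key TEXT PRIMARY KEY, value TEXT)");
}
KeyValueStore.prototype.get = function (key) {
  var rows = this.execute("SELECT value FROM prefs WHERE key = ?", [key]);
  return rows.length ? rows[0].value : null;
};
KeyValueStore.prototype.set = function (key, value) {
  // MySQL's ON DUPLICATE KEY UPDATE is missing, but SQLite's own
  // INSERT OR REPLACE gives the same upsert without a prior get
  this.execute("INSERT OR REPLACE INTO prefs (key, value) VALUES (?, ?)", [key, value]);
};
```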