And it is over! We completed Magnet’s first HackFest at 5pm today. A bunch of hackers came together on 23rd December at 5pm, hacked on system internals, coded cool ideas, researched future trends, watched movies, ate pizzas and vada pavs, goofed around, and had a tremendous experience over 24 hours.
As I wrote yesterday, we settled in, brainstormed ideas, and then played Quake. It was 9pm by then. Some had started working on their ideas, some were interested in playing more. We ordered pizzas. The pizza guy came in at the 27th minute (30 minutes, nahi to free, right?). I asked him who pays the bill if he gets late; I had seen and heard in the movies that the delivery guy has to take the loss. He said the company pays. The pizzas were good, and we were stuffed by the time we finished half of what we ordered.
That was the right time for a movie. We juggled a few options and finally decided to watch Resident Evil. We watched the first part, and half of the hackers were already on their computers by then. Some even watched the second part. Things were getting pretty exciting by that time.
My first hack task was over, thanks to the automatic fix on the popularity contest plugin for WordPress. So I started looking around for other things. I wanted to try out Photoshop CS3, so I put that on download. Guess when I finished the download? No, not in the morning; it’s still not done! The Adobe Labs download process is a little finicky. For one, it requires you to log in to download the beta. And then it would not work with wget or any other download manager. I am on a Mac, and I started the download in Firefox. Now Firefox has a big annoying problem with downloads: if you lose the connection in between, or pause the download, sometimes you have to start all over again. I did not want to put a 685MB download on Firefox, but then I did not have wget on the Mac either. So I figured I would start the download, and download and compile wget in the meanwhile; I had been delaying that for quite some time anyway. I downloaded the wget source code, went to compile it, and realized there was no gcc on the machine yet! I got out the Mac OS X DVDs and installed the developer tools and gcc. The compile would still not work, and failed with the error message: “C compiler cannot create executables”.
Now that was something I saw for the first time! A little bit of digging around and reinstalling the packages did not solve the problem. Installing gcc 4 instead of 3.3 finally got it compiling. But by this time, I had found out that though wget does not come with the Mac anymore, there is cURL, and I can do more things with cURL. So I tried to hook up my CS3 download on cURL, but it wouldn’t go ahead without a login. I gave up on the attempts, as the download was progressing well anyway.
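For the record, cURL can often get past a browser-based login if you replay the browser’s session cookies, and unlike Firefox it resumes cleanly with `-C -`. A rough sketch, where the URL is a placeholder and cookies.txt is assumed to hold the session cookies exported from the browser after logging in (Adobe’s login flow may well defeat this anyway):

```shell
#!/bin/sh
# Hypothetical: resume an authenticated download with cURL.
# The URL is a placeholder, and cookies.txt is assumed to hold the
# session cookies exported from the browser after logging in.
URL="https://example.com/downloads/PhotoshopCS3_beta.dmg"
OUT="PhotoshopCS3_beta.dmg"

# -L follows redirects, -b replays the login cookies, and -C - resumes
# from wherever the partial file left off instead of starting over.
CMD="curl -L -b cookies.txt -C - -o $OUT $URL"

# Printed rather than executed here, since the endpoint is made up.
echo "$CMD"
```

If the partial 685MB file from Firefox were still around, `-C -` would pick up from its current size instead of starting from zero.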
Started looking at showing related content from other blogs when you are viewing a blog entry. I already have a “related posts” plugin for WordPress, and thought it would be good to have links to related posts on other blogs too. Naveen did some research and found Sphere, which does the same thing, albeit with an unfriendly interface. And then I found out that Google too has a “related links” provision. I had seen some wonderful visualizations done on this principle though, and I think this is still an open idea: visualizing related content, similar to what LivePlasma does, or what the big picture does for CNet news.
I also wanted to solve some long-pending problems during the hackfest. So I took on managing the software updates for the Macs we have in the office. We have 3 MacBooks, a PowerBook, a Mac Mini, and an iMac. I wanted a system for software updates, so that we don’t have to download the updates individually on each laptop. Now if you know the Software Update process on the Mac, it’s pretty much automatic; there are no real options on the interface to point it to an update repository. I had done some research on this earlier and knew that Mac OS X Server has a feature to update all Macs from a single update server. We did not have OS X Server, so I had to do something else.

A lot of reading and some Googling later, I found two things: the Software Update Enabler, which allows me to point to a URL of my choice for getting the updates, and a shell script, sumirror, that can mirror the Software Update repository on a machine for you. This was very much what we wanted, so I started working on it. A few hours of hacking, and we now have a solution that works for us. I modified the shell script to detect the requests made by the clients which were not found (via 404 messages in httpd.log), then took that list and got the IDs of the packages that were queried for. Remember, we did not want to download anything extra; Mac software and updates are very heavy, and I don’t want to kill my bandwidth on them. So I made it pick up only the software that the clients actually checked for. After getting this list, it was a bit of processing on the software update catalog file to get the correct URLs to download the files, skipping languages other than English, and testing things out. I am very happy that this worked; I had wanted to do it for quite some time. I will write up a detailed note on this in the coming days, so if you want to do it, you can do it easily.
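The 404-scraping step can be sketched roughly like this. The log lines, paths, and file names below are made up for illustration, and the real sumirror script differs in detail:

```shell
#!/bin/sh
# Sketch of the idea: find update packages that clients requested but
# the mirror did not have (404s in the access log). The sample log is
# fabricated here so the pipeline has something to chew on.
LOG=/tmp/hackfest_access_log
cat > "$LOG" <<'EOF'
10.0.0.5 - - [24/Dec/2006:10:00:00 +0530] "GET /su/SecUpd2006-008.pkg HTTP/1.1" 404 209
10.0.0.5 - - [24/Dec/2006:10:00:01 +0530] "GET /su/index.sucatalog HTTP/1.1" 200 4096
10.0.0.6 - - [24/Dec/2006:10:00:02 +0530] "GET /su/iTunesX.dist HTTP/1.1" 404 209
10.0.0.6 - - [24/Dec/2006:10:00:03 +0530] "GET /su/SecUpd2006-008.pkg HTTP/1.1" 404 209
EOF

# In Apache's common log format, field 9 is the status code and field 7
# the requested path. Keep only missing package files, de-duplicated.
missing=$(awk '$9 == 404 { print $7 }' "$LOG" | grep -E '\.(pkg|dist)$' | sort -u)
echo "$missing"
```

Feeding this list back into the mirror script then downloads only what real clients asked for, instead of the entire repository.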
I did a lot of other research while all this was going on. I am actually thrilled that I learnt so many new things. Here are a few things I tried or read:
- SpeedDownload download manager for Mac. I am happy with cURL.
- Using Lucene to find related content
- Adobe Apollo – cross platform rich internet application development
- Read a few things about Apple internals at the Developer Connection
- Mashups and APIs at ProgrammableWeb. Worth mentioning is Attendr (on the lines of Glancer), and I liked Beam Me Up Hottie 😉
- All the Google APIs, and especially the Google Calendar API. Vinay and Mohan are creating something based on that and the Flex Scheduling Framework now
- Got Firefox 2, which still does not fix the download manager problems
- Read about Yahoo!’s HackDay and Gutentag
- Watched Iain Lamb’s “The New Hacker’s Toolkit”
- Also watched Bounding by Pixar this morning!
- Too bad that Orkut does not have any APIs now! So much could be created if there were
- Downloading Parallels build 3306 beta now
- Learnt an easy way to send screen captures instead of web camera on a conference using Flash Media Server
- A little bit on Second Life
- The 2007 Web Predictions from Read/Write Web
- The MyBlogLog widgets – they are pretty interesting
- Am also excited about Spry and other things on Adobe Labs
- Tried getting Ethereal on the Mac; it didn’t work, and I don’t want all the GTK libraries on the Mac yet
- Looked around HexEdit, and set up Adobe Reader for Mac so that I can use digital signatures
- Looked at SearchMash – and the idea looks good
- Oh, and even installed CrossOver to run Windows apps
And then probably a few other things! I had been downloading the Photoshop CS3 setup on our server, using SSH and lynx, and was downloading the remaining 163MB after the connection broke. A few minutes ago, the download completed. And vanished! I can’t find the file anywhere now. I will have to download the whole thing again. This is really a pain.
Apart from that, it’s been a great hackfest! Vishal is working on profiling database queries for optimization, and Kartik is working on having Festival speak Gujarati. Arun worked on shutting off all the servers from a single machine, and is looking into fax support for Asterisk. And I just heard some noises from Vinay and Mohan; looks like they got something done!
And here is an interesting panorama that Ameya took. Vinay, Arun and Mohan interviewed me and Kartik about programming, hacking and general questions. This photo is from that time.