Saturday, August 30, 2008

Test Driving the new Thunderbird 3 alpha 2 (Shredder) on Linux


Feeling a little adventurous today? Want to get your hands dirty with something cool and really alpha? Here is something for you.
The nice folks at Mozilla spun off Thunderbird into a separate project called Mozilla Messaging, with the prime objective of developing Thunderbird 3 (codenamed Shredder), the next-generation open-source email/messaging client.

Some of the notable enhancements in Thunderbird 3 will be the integration of Lightning (a calendar extension), improved search, and various configuration and user-interface improvements.

A gentle warning: it is not meant for production environments. Let us start, shall we?

1. Download the Linux version of Shredder alpha 2.

2. Extract the contents to a local folder (in my case, the Desktop).

3. Browse to the folder and run the application, either from the shell (as below) or by double-clicking the binary file called "thunderbird".



$ ~/Desktop/thunderbird/thunderbird


4. Thunderbird launches with a welcome screen, where you can configure your accounts (in my case, Gmail).


5. Like Thunderbird 2, setting up a Gmail account is a breeze. Enter your username and password, and you are done. Note: it sets up POP access.


6. You can specify all your account settings like before.
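Step 5 mentions that the wizard sets up POP access. For the curious, here is a minimal sketch of what that means at the protocol level: the client logs in over POP3 and asks the server how many messages are waiting, which is essentially what Thunderbird does behind the scenes. The `message_count` helper and the `use_ssl` switch are my own illustrative names; Gmail's published POP host and port are used as defaults.

```python
import poplib

def message_count(user, password, host="pop.gmail.com", port=995, use_ssl=True):
    """Log in over POP3 and return the number of messages waiting."""
    pop_class = poplib.POP3_SSL if use_ssl else poplib.POP3
    conn = pop_class(host, port)          # connect and read the server greeting
    try:
        conn.user(user)                   # USER <name>
        conn.pass_(password)              # PASS <secret>
        count, total_bytes = conn.stat()  # STAT -> (message count, mailbox size)
        return count
    finally:
        conn.quit()                       # QUIT, then close the connection
```

(Gmail also requires POP to be enabled in its web settings before any POP client can connect.)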



I wasn't expecting any new features, as Shredder alpha 2 is mainly meant to test the transition to the latest Gecko 1.9 engine, on which Firefox 3 is also based. Play around, and if you do find bugs, report them back.
Well, that's it. Happy alpha testing, and keep supporting the Mozilla Foundation.

Friday, August 08, 2008

Google Insights: Finding Paris for Danes


Numbers intrigue me a lot. For instance, did you know that more Danes have searched for Paris Hilton in the last 30 days than people anywhere else in the world (including the Americas)? And did you also know that Obama has a huge fan following in Kenya?
There was a time when collecting statistics such as these would have taken months or even years. But now, just by observing users' browsing and search patterns, a lot of really interesting and sometimes useful information can be collected in no time at all.

We discussed Twist, which analyzes Twitter posts, a few months back. But then, how many people really use Twitter? What is it that almost the entire world uses, more frequently than anything else? This might help you.



If you answered "Google", you are almost right. Lately there has been a huge shift towards social networking (Facebook, MySpace and the like), but one has to admit that Google knows more about people than anyone else.

Google seems to have spun off "Trends" into "Insights for Search". Insights analyzes user search behavior over the last four years, across regions, cities and timelines. A statutory warning here: it's really quite addictive.

For instance, as a Linux enthusiast based in Singapore, I was disheartened to find that awareness of and interest in Linux have been falling continuously over the last four years. With these statistics, I know we've got to pull up our socks and try harder.



Another example: for a Microsoft Xbox 360 sales guy in Singapore, the following consumer trend is a warning bell to cut prices, increase awareness and throw in more promotions.



I could go on and on; Google Insights is an amazing tool. It doesn't guarantee accuracy, but it paints a really interesting picture of what's buzzing through the tubes. I only wish there were a way to filter the positive from the negative "insights": some things top the list for all the wrong reasons. But in future, context-based searching will hopefully resolve this issue. Better not to discuss Cuil here :)

Insights showcases the amount of information Google collects. It also reinforces why they are the biggest targeted online-advertising agency. And it reaffirms my faith in Google: if there is anything to be found online, Google will find it for you. Godspeed, my friend, and keep the numbers flowing :)

Friday, August 01, 2008

Cloud Computing: What Does it Mean for Enterprises?


Data is the single most important asset of any modern enterprise: transaction histories, customer records, asset inventories, intellectual property and so on. Even modern currency is mostly just 1s and 0s flowing through secure networks.

Data centers are the lifelines of such enterprises, and companies spend millions, sometimes billions, to run and manage these data-processing centers. Lately, due to the rising costs of maintenance, capital expenditure, enterprise licenses, power and so on, companies are looking for cost-effective alternatives. Analysts predict that by 2012 the power costs of running data centers will jump 13-fold. 60% of the power consumed in cooling these centers is wasted through inefficiency, and there is a huge impetus to go green and become more power-efficient in the long run.

A common technique modern organizations use to control rising costs is to consolidate a large number of scattered server farms, network rooms, communication centers and regional data centers into fewer, centralized data centers. Known as data center consolidation, this is a key driver for reducing costs and optimizing existing resources by improving utilization rates. Virtualization solutions like VMware have also enabled better utilization rates by running multiple OSes and applications on a single server.

Cloud computing views technology resources and infrastructure as "always on" services, where customers can tap into a vast shared pool to run their own applications and services. From the customer's or end user's perspective, they only need to care about the subscription fees charged by the service provider, while Service Level Agreements (SLAs) ensure a minimum level of service quality and support.




Companies like IBM, Google, Amazon and Yahoo are the early promoters of this new phase in computing. Google, which invested more than a billion US dollars in capital expenditure last year, offers the Google App Engine service, whereas Amazon's S3 is quite popular among startups and SMEs who can't afford to run and manage their own data centers.

Cloud computing is, in some sense, a mirror of data center consolidation initiatives. By exploiting economies of scale and hosting applications from thousands of customers in centralized data centers, providers increase the utilization rates of their existing systems, can negotiate larger discounts with vendors, can forecast energy requirements, and can keep a small trained on-site workforce alongside cheaper outsourced support staff.
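The utilization argument can be made concrete with a back-of-the-envelope sketch. All the numbers below (1,000 customers, the per-customer peak and average loads, the 3x aggregate-peak factor) are illustrative assumptions, not real data:

```python
customers = 1000
peak_load = 10  # capacity units each customer must provision for on its own
avg_load = 2    # capacity units each customer actually uses on average

# Dedicated model: every customer buys hardware sized for its own peak.
dedicated_capacity = customers * peak_load
dedicated_utilization = customers * avg_load / dedicated_capacity

# Shared cloud: customer peaks rarely coincide, so the provider provisions
# for the aggregate peak, assumed here to be 3x the aggregate average.
shared_capacity = customers * avg_load * 3
shared_utilization = customers * avg_load / shared_capacity

print(f"dedicated: {dedicated_utilization:.0%}, shared: {shared_utilization:.0%}")
# dedicated: 20%, shared: 33%
```

The same pooled capacity serves everyone at a much higher utilization rate, which is where the cost savings come from.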

An important question now arises: what makes a modern-day enterprise shy away from cloud computing? After all, isn't reducing costs the prime focus of companies in such times of recession and economic slowdown?

Well, the answer is that it's not as straightforward as it seems. Firstly, is it safe to trust a third party with your sensitive business data? Will your customers agree to it? What about regulatory requirements? Countries like Japan and Korea discourage the practice of storing local production data on international servers.

Moreover, some critical business applications can suffer unacceptable latency, depending on the proximity and bandwidth of the connection to the "cloud" network. In such cases, you have to fall back on your existing infrastructure or search for other "clouds" that satisfy your latency requirements.
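One way to size up that risk before migrating is simply to measure the round trip to a candidate provider. A hedged sketch (the `connect_latency_ms` helper is my own name, and a TCP connect time is only a crude, dependency-free proxy for application latency):

```python
import socket
import time

def connect_latency_ms(host, port, timeout=3.0):
    """Return the time in milliseconds taken to open a TCP connection."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the handshake completing is all we wanted to time
    return (time.monotonic() - start) * 1000.0
```

For example, comparing `connect_latency_ms("s3.amazonaws.com", 443)` against your application's latency budget gives a first rough answer to whether a given "cloud" is close enough.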

To sum up, cloud computing seems like the right way forward. It takes away the pain that comes with managing your own infrastructure. But along with the major technology breakthroughs, legal processes and controls also need to be in place before it can be accepted into the mainstream. Service providers will have to work harder to build trust among customers and to guarantee a resilient, secure and stable infrastructure solution.

A lot of work is still needed, but at least we are headed in the right direction. 2012 – the era of cloud computing? Only time will tell…