Cisco Systems is bolstering its unified communications and collaboration portfolio with the purchase of a corporate instant-messaging company.
On Friday, the networking giant announced it will purchase Jabber, whose open-source IM and presence protocol, XMPP, is also used by Google Talk and Gizmo. The company didn't disclose financial details.
In essence, Jabber's technology allows multiple IM platforms to "talk" to each other: people using tools such as Microsoft Office Communications Server, IBM Sametime, AOL AIM, and Google Talk can send messages to one another and get presence information about one another.
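For the curious, here is roughly what sending a message over that open protocol looks like in practice. This is a minimal sketch using the third-party Python library slixmpp, which is not mentioned in the article; the accounts and server are made up.

    import slixmpp

    class HelloBot(slixmpp.ClientXMPP):
        """Log in, announce presence, and send a single chat message."""

        def __init__(self, jid, password, recipient, text):
            super().__init__(jid, password)
            self.recipient = recipient
            self.text = text
            self.add_event_handler("session_start", self.start)

        async def start(self, event):
            self.send_presence()      # broadcast availability ("presence") to contacts
            await self.get_roster()   # fetch the contact list from the server
            self.send_message(mto=self.recipient, mbody=self.text, mtype="chat")
            self.disconnect()

    # Hypothetical accounts; any XMPP-speaking client could sit on either end.
    bot = HelloBot("alice@example.com", "secret", "bob@example.com", "Hello over XMPP")
    bot.connect()
    bot.process(forever=False)

Because the protocol is open, the recipient could just as easily be on a different XMPP-speaking service, which is the interoperability Cisco is buying.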
Jabber's technology is already used by some large companies including AT&T, BT, EarthLink, FedEx, and JP Morgan.
The deal will help Cisco compete even more aggressively against Microsoft in the unified communications market. And it will fit nicely with some of Cisco's previous acquisitions, including the purchase of Web conferencing company WebEx.
Cisco expects to close the deal in the first half of 2009. Jabber's 50 employees will join Cisco's Collaboration Software Group.
Stylish, a Firefox extension that lets you make big changes to other people's Web sites with minimal effort, enables one of the cooler Gmail re-skin jobs I've seen. For people who like drumsticks instead of Gmail's boring, yet supple thigh meat, installing a Stylish style named simply "Gmail Redesigned" lets you turn Gmail's exterior into a gradient- and plastic-button-filled playground. The best part is that it retains the speed, button placement, and all-around "Gmailness" you've grown to love.
Besides your in-box, the add-on skins the compose page, the Google Talk sidebar, and entire conversation threads. The last of these is actually improved in the translation, as the color-coding of conversations makes it easier to parse multi-contact communications.
The only problems I ran into were small visual quirks. For example, in-box media manager Xoopit works just fine, but retains its old-school Google look and thus sticks out like a sore thumb. I'm assuming any other Gmail add-ons that haven't been integrated into the makeshift style sheet will experience the same thing until special bits of CSS are included to skin them too.
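For a sense of what those "special bits of CSS" look like, here is a minimal userstyle sketch. It is not taken from Gmail Redesigned itself: Stylish applies user-supplied CSS only to the domains you list, and the rule below is purely illustrative, since Gmail's real markup uses generated class names.

    /* Scope the style to Gmail; @-moz-document is how Stylish targets a site. */
    @-moz-document domain("mail.google.com") {
      /* Illustrative rule only: darken the page. */
      body {
        background: #1b1b1b !important;
        color: #ddd !important;
      }
    }

A full re-skin such as Gmail Redesigned is essentially a much longer style sheet of rules like this one, which is also why add-ons such as Xoopit stay unskinned until matching rules are written for them.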
To get going, just install the Stylish plug-in here, then restart Firefox. Once you're back up and running, click the download button on this page and enable the new look from the plug-in options menu under Tools --> Add-ons. When you return to Gmail, it will be dark and mysterious.
Gmail Redesigned lets you skin Gmail to look dark and mysterious while retaining all of its speed and menus.
Google's Webmaster Central has become a very important resource for anyone who has a Web site, works on a Web site, or, like SEO practitioners, helps others with their Web sites.
Google continues to roll out new features and improve existing ones, and it has just done a bit of both with the addition of a Generate robots.txt function.
Google had previously added a robots.txt analyzer, which at this point is still the more useful of the two tools. For those who aren't aware, the robots exclusion protocol instructs search engines how to interact with a Web site. There are a number of directives available, but the main purpose of the robots.txt file is to tell the search engines which content a site owner doesn't want their robots to crawl.
Why in the world would you not want search engines to crawl some of your content? You may have content that, for whatever reason, you don't want others to find through search results. Note, however, that this is not the same as protecting secure information, which requires authentication through a log-in.
Your site may have its own search function that creates "search results" pages. Search engines generally do not want to include search results within their own search results, so this content may not be returned by the engines anyway; blocking it focuses the crawlers elsewhere for greater crawling efficiency.
Or you may have duplicate-content issues that robots.txt can filter out. This is especially common with content management systems that create a separate printer-friendly version of each page.
Regardless of your specific needs, having a robots.txt file can be important to a site. Rarely is there a site that can't benefit from disallowing at least some content. Even if you have nothing to disallow, you may want to take advantage of the auto-discovery feature for your XML sitemap. Finally, depending on your server log system or analytics package, not having a robots.txt file can inflate your "404 File Not Found" error reporting, because search engine spiders automatically request the file whenever they visit your site.
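Pulling those use cases together, a file along these lines would keep spiders out of internal search results and printer-friendly pages while advertising the sitemap for auto-discovery. The paths are hypothetical, so substitute your own:

    User-agent: *
    Disallow: /search
    Disallow: /print/

    Sitemap: http://www.example.com/sitemap.xml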
Right now, the robots.txt generator is rather basic, and I hope that Google will add more features to it going forward. Currently, site owners have to paste in URLs and URL patterns to build the file. It would be great if it provided a list of URLs or patterns extracted from the site to help automate the procedure for anyone not familiar with the protocol.
There is more information about the protocol, though a bit more on the technical side, at the robotstxt.org site, and you can find more engine-specific information on crawling and robots.txt from Google, Yahoo, MSN, and Ask.com.
One important tip: the following directive tells all spiders they are allowed to go anywhere:

    User-agent: *
    Disallow:

And, more importantly, the following directive, which I sometimes see when I think people really wanted the one above:

    User-agent: *
    Disallow: /

The latter tells the spiders to stay out of the entire site--clearly two very different results, so be sure you understand which does what.
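If you'd rather not rely on eyeballing the file, you can test a directive before deploying it. Here is a small sketch using Python's standard robotparser module (urllib.robotparser in Python 3); the URL is made up.

    from urllib import robotparser

    # "Disallow:" (empty) permits everything; "Disallow: /" blocks everything.
    rp = robotparser.RobotFileParser()
    rp.parse(["User-agent: *", "Disallow: /"])
    print(rp.can_fetch("*", "http://www.example.com/page.html"))  # prints False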
I was poring over a university research paper Tuesday afternoon on the connection between the use of corn-based ethanol in the U.S. and greenhouse gas levels. That was just a grim appetizer for the big eco-news du jour later in the afternoon.
Turns out that Riau, a province on the Indonesian island of Sumatra, has the dubious honor of producing more average annual greenhouse gas emissions "from deforestation, forest degradation, peat decomposition, and peat fires between 1990 and 2007" than the Netherlands does. That's due to the local practice of supplying global paper giants and palm oil plantations with raw materials processed from forests and peat swamps.
Because of the ongoing forest clearance projects in areas with deep peat soils, experts warn that the region's carbon emissions will likely climb.
The report was jointly published under the auspices of Hokkaido University, the World Wildlife Fund, and Remote Sensing Solutions GmbH.
The researchers painted a sober picture of the changes wrought by deforestation. Here's the link to the full report.