Tuesday, October 21, 2008

ASUS CEO: Windows 7 touchscreen Eee PCs in mid-2009, $250 model on the way

We heard the rumor; now ASUS CEO Jerry Shen confirms in an interview with Laptop that ASUS will slap a touchscreen and Windows 7 into a new Eee PC sometime in the second half of 2009 -- a statement likely to make a few project managers at Microsoft uncomfortable, as it cuts into the official early-2010 padding built into their Win7 Gantt charts. The touch-enabled Eee PC model(s) could come in the form of a convertible tablet, although Shen wouldn't specify -- he only promised more details in Q1, presumably at CES. Unsurprisingly, ASUS has no plans to put Vista on Eee PCs at all. Also noteworthy is the introduction of "more exciting" Eee PC rigs in Q1 and Q2 at prices ranging from $250 (yes, $250) to $700 -- steadily inching closer to that elusive $199 Eee PC.

Other interesting points from the interview:
  • Eee Top all-in-one PC will be released at the end of this month
  • EeeStick (and compatible games) is to be released soon as both an Eee PC bundle and as a separate accessory (depending upon country) priced somewhere between $50 and $100
  • Two new "Eee products" (not Eee PCs if we read this correctly) will be announced in January
  • Eee PC devices will be limited to 10-inch and smaller displays -- Shen describes the netbook as a platform for consuming content, whereas a laptop is for creating content
  • ASUS is focused on improving battery life and startup times on future Eee PCs -- adding more power, like dual-core Atoms, is not a priority

Resource - Engadget

Texas Instruments Plans to Cut 650 Jobs

Texas Instruments, which makes chips for cellphones and other devices, warned on Monday that orders were slowing rapidly and said it would cut jobs to save money.

The company said it would reduce its work force by 650 in six countries from a unit that makes chips for cellphones. And it is in talks to sell part of the unit, which has struggled because of a drop in sales at a big client, Motorola.

The company essentially met analyst expectations for third-quarter earnings on Monday, but the chief executive, Richard K. Templeton, said order trends have been weak in the last few months.

Texas Instruments said it expected sales of $2.83 billion to $3.07 billion in the fourth quarter, well below the $3.3 billion that analysts polled by Thomson Reuters were expecting. In the fourth quarter last year, sales were $3.56 billion.

The company expects 30 cents to 36 cents a share in fourth-quarter earnings, while analysts were expecting 44 cents.

There have been other indications that the fourth quarter would be a bad one for semiconductor companies; last week, a rival, Linear Technology, warned of a sales drop.

Investors, however, seemed taken aback by Texas Instruments’ warning. Its shares were down 93 cents, or 5 percent, at $17.05 in extended trading Monday, after the release of the results.

For the third quarter, the company earned $563 million, or 43 cents a share, down 27 percent from $776 million, or 54 cents a share, in the same quarter last year. Excluding $18 million in earnings from discontinued operations in the year-ago period, income from continuing operations a year ago was $758 million, or 52 cents a share.

Revenue declined 8 percent to $3.39 billion.

Analysts were expecting earnings of 44 cents a share on sales of $3.4 billion. However, their estimates generally exclude one-time charges and benefits. Such items reduced earnings in the quarter by $10 million, or nearly a penny a share.
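The per-share arithmetic here is easy to check. A quick sketch using only the figures reported above (the share count is implied from the article's numbers, not disclosed in it):

```python
# Back out TI's implied share count from the reported Q3 figures,
# then check the per-share impact of the one-time items.
net_income = 563_000_000          # Q3 net income, dollars
eps = 0.43                        # reported earnings per share

implied_shares = net_income / eps        # roughly 1.31 billion shares

one_time_items = 10_000_000              # net one-time charges/benefits
impact_cents = one_time_items / implied_shares * 100

print(f"implied shares: {implied_shares / 1e9:.2f} billion")
print(f"one-time impact: {impact_cents:.2f} cents a share")
```

The impact works out to about 0.76 cents a share, consistent with the "nearly a penny" in the article.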

The job cuts will come from the division that makes so-called baseband chips, which beam a phone call to a cell tower. The cuts will save $200 million a year, the company said.

The unit to be sold is the one that supplies off-the-shelf baseband processors to smaller manufacturers.

Of the 650 layoffs, 350 will be in this unit, said Ron Slaymaker, the vice president for investor relations.

The company has about 30,000 employees.

Resource - The New York Times

Two new HP devices revealed

HP’s mysterious new smartphone entrants have been revealed, and they certainly look stylish enough. Apparently the result of bringing design in-house, the HP iPAQ VoiceMessenger and DataManager feature unique looks.

HP iPAQ DataManager

The DataManager is a Windows Mobile 6.1 Professional device featuring a slide-out keyboard, a 2.8-inch QVGA screen (really?), Wi-Fi, Bluetooth, quad-band GSM, tri-band 7.2 Mbps HSDPA (European bands), HSUPA, A-GPS and a 3.1-megapixel camera with LED flash. It has 128 MB of RAM, 256 MB of ROM, microSD expansion and a 1140 mAh battery.

The smartphone weighs 160 g, measures 5.7 x 1.74 x 11.4 cm, and has a 2.5mm headphone jack. Opera Mobile is bundled, as are Google Maps for mobile and a 30-day trial of Webraska Turn-by-Turn Navigation.

HP iPAQ VoiceMessenger

The HP iPAQ Voice Messenger has similar specs to the DataManager, but instead of the slide-out keyboard it features a 20-key SureType-style keyboard. The non-touchscreen Windows Mobile 6.1 Standard device has a 2.4-inch screen, which is regrettably also QVGA. The connectivity and GPS receiver are the same, but the device is much lighter at 107 g and measures 5.0 x 1.36 x 11.4 cm. Despite this it features a larger 1260 mAh lithium-polymer battery.

Both devices feature micro-USB ports for synchronization. Both handsets also feature a new optical sensor that replaces the five-way navigator key common to many phone designs. This makes it easier for users to move through menus with greater speed while minimising the chances of mechanical failure, according to Neil Dagger, HP’s iPAQ and wireless business manager. “You just glide your thumb over the disk, and you glide through the menus, tapping it when you want to select something,” he said.

Both models also have a power saving mode that minimises power consumption if the battery charge level starts to get low while the user is out on the road.

The two models will be available either SIM-free through HP resellers or with a wireless contract through Vodafone across Europe.

HP said that the iPAQ Voice Messenger is aimed at users who want a phone for voice calls first, but who may also need to receive email on the go.

David Wright, vice president and general manager of HP’s personal systems group, said the new smartphones were designed to bridge the gap between work and play, targeting what the company calls “prosumers”, or professional consumers, and small and medium businesses.

The iPAQ Voice Messenger costs £333 (SIM-free) and is due to ship in mid-November, while the iPAQ Data Messenger costs £399 (SIM-free) and will be available from the end of November.

Vodafone pricing has yet to be disclosed, but the operator is likely to offer the devices free depending on the monthly tariff chosen.


Resource - WMPowerUser

Yahoo Layoffs Expected to Hit This Week

Yahoo started out the year with layoffs, and it is going to end the year with more. The layoffs have been expected ever since Yahoo hired hatchet men from Bain & Co. to come help with the downsizing. The exact number of layoffs is still not known—between 1,000 and 3,000 are the numbers being discussed. During its earnings conference call on Tuesday, Yahoo is expected to announce how many people it will let go. In addition, operating budgets across the board are expected to be cut 15 percent.

In January, Yahoo laid off 1,000 people, and all year it has been suffering from a major drain of talent. But it still has plenty of employees—about 14,000. Getting rid of another 10 percent or so would have a meaningful financial impact by lowering expenses, but it would also lower morale.

Not that Jerry Yang has much choice these days. We’ll update the layoff tracker when the final numbers are known.


Resource - TechCrunch

MINI E finally official, 500 available soon for US test drivers


The much-anticipated MINI E -- the first all-electric MINI -- is headed our way before you know it. BMW has built itself a decent performer, offering a 204 hp electric motor in a setup quite similar to the Tesla's. The car boasts a 150-mile range from its 35 kWh lithium-ion battery pack, can hit 62 mph in 8.5 seconds, and does a full charge off an included high-current charging station in a mere 2.5 hours. There's naturally a regenerative braking system on board to help top up the battery in city driving. BMW plans on leasing 500 of these to commercial and private customers in California, New York and New Jersey sometime in early 2009, and Europe might get a crack at the car soon after that. No word yet on when we'll see this car ready for the masses, but perhaps we'll get more info when the MINI E makes its "debut" at the LA Auto Show next month.

Resource - Engadget

Motorola Readies Its Own Android Social Smartphone

http://images.businessweek.com/story/08/370/1018_moto_android.jpg

Motorola

As the wireless world awaits the Oct. 22 debut of the first phone based on the Google-backed Android software, engineers at Motorola (MOT) are hard at work on their own Android handset. Motorola's version will boast an iPhone-like touch screen, a slide-out qwerty keyboard, and a host of social-network-friendly features, BusinessWeek.com has learned.

Motorola has been showing spec sheets and images of the phone to carriers around the world in the past two months and is likely to introduce the handset in the U.S. sometime in the second quarter of 2009, according to people familiar with Motorola's plans. Building a phone based on the highly anticipated Android operating system is part of Motorola's effort to revive a loss-making handset division that has forfeited market share amid a drought of bestselling phones. Motorola stock, which on Oct. 17 rose a penny to $5.62, is hovering near a 16-year low.

The phone will appear among a new class of social smartphones designed to make it easy for users to connect quickly and easily to mobile social networks such as Facebook and News Corp.'s (NWS) MySpace (BusinessWeek, 10/10/08). Such phones let users message in-network friends directly from phone contact lists, for example. A Facebook representative declined to comment on the company's work with Motorola. MySpace.com didn't respond to a request for comment.

Motorola declined to elaborate on its plans, but said in a statement: "We're excited about the innovation possibilities on Android and look forward to delivering great products in partnership with Google (GOOG)" and the community of developers known as the Open Handset Alliance that are working on the Android operating system.

Mobile Networking Wave

In the next year, social networking phones are expected to be a hit with the 16- to 34-year-old crowd, analysts say. According to consultancy Informa (INF), the number of mobile social-networking users will rise from 2.3% of global cell-phone users at the end of 2007 to as many as 23% of all mobile users by the end of 2012.

The Android handset will feature a touch screen about the size of those on Apple's (AAPL) iPhone, people familiar with the phone say. While it takes some of its design cues from the Krave ZN4, the first touch-screen phone from Motorola, launched with Verizon Wireless on Oct. 14, it's not certain whether the Android phone's screen will feature the Krave's distinctive interactive clear flip screen.

Like the world's first Android phone, from HTC, Motorola's Android-based device will offer a slide-out Qwerty keyboard. People who've seen the pictures and spec sheets for the device say it looks like a higher-end version of the HTC phone, called the T-Mobile G1. But it's expected to sell for less, at prices similar to the Krave, which is available for $150 with a two-year contract. After carrier subsidies, the G1 will retail for $180 with a two-year contract.

Slow Off the Mark

Motorola's new phone likely won't be ready to launch in the U.S.

Resource - BusinessWeek

Java support for App Engine to counter Microsoft's cloud initiatives: Microsoft Strata?

At the recent Google Developer Day in Bangalore, keynote speaker Prasad Ram said that Google App Engine will now support Java.

Some people believed that supporting a static programming language like Java on a platform built around the dynamic language Python wouldn't be easy. But Google clearly has the infrastructure and back-end architecture to support static languages too. "Java," said another speaker, "was chosen based on community feedback." Apparently, many people wanted to build web apps using Java.

The recent Google coding event, Code Jam, drew over 10,000 participants from around the world, of whom nearly 45% used C++ and 25% used Java; Python and C# were each used by roughly 10% (statistics from the preliminary round). C++ was clearly the most popular language, almost twice as popular as Java. Why, then, is there no correlation between the languages people want to use on App Engine and the languages Code Jam participants chose? Perhaps because Java has more web-development libraries and programmers.
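For reference, the shares quoted above work out as follows (a quick sketch using the article's approximate percentages; the flat 10,000 participant count is a rounding assumption, as the article only says "over 10,000"):

```python
# Approximate language shares from the Code Jam preliminary round,
# as quoted in the article (percentages of roughly 10,000 entrants).
shares_pct = {"C++": 45, "Java": 25, "Python": 10, "C#": 10}
participants = 10_000   # rounded; the article says "over 10,000"

# Estimated headcount per language
counts = {lang: participants * pct // 100 for lang, pct in shares_pct.items()}

# "almost twice that of Java": 45 / 25 = 1.8
cpp_vs_java = shares_pct["C++"] / shares_pct["Java"]

print(counts)
print(f"C++/Java ratio: {cpp_vs_java:.1f}x")
```

So roughly 4,500 C++ entrants against 2,500 Java entrants, a 1.8x ratio.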

Meanwhile, the blogosphere suspects that at PDC (Oct. 27-30) Microsoft will announce its own version of the entire cloud stack -- the .NET platform, an SDK with Visual Studio support, and hosting infrastructure -- developed under the secret project name "Microsoft Strata." The move will certainly be welcomed by people who work on .NET, but will it be real competition for Google App Engine? Garret Rogers thinks not -- but has he read enough about Microsoft Strata? And if App Engine's Java support lands alongside the Android SDK on Oct. 22, the timing would be apt.

Resource - Control Enter.in

New MacBook Pro: now with 20% less battery power

Apple touted some pretty decent battery life numbers at the new MacBook / MacBook Pro press event this week: up to five hours on the Pro with discrete graphics off, and four hours with it on (both surely assuming ideal low-power conditions). One thing that didn’t come up at the presser, however, was that the new MacBook Pro batteries actually hold just under 20% less energy than their predecessors. While both kinds of MBP batteries are 10.8 V, the old ones are rated at 5600 mAh / 60 Wh, while the new ones are rated at 4700 mAh / 50 Wh. (MacBook numbers updated below.)
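The watt-hour ratings follow directly from voltage times amp-hours, so the figures above are easy to sanity-check (a quick sketch; note the raw numbers put the drop at roughly 16%, a bit shy of the headline figure):

```python
# Verify the battery figures: watt-hours = volts x amp-hours.
voltage = 10.8                      # volts, both battery generations

old_mah, new_mah = 5600, 4700       # milliamp-hours
old_wh = voltage * old_mah / 1000   # ~60.5 Wh (rated 60 Wh)
new_wh = voltage * new_mah / 1000   # ~50.8 Wh (rated 50 Wh)

# Fractional drop in capacity (same whether computed in mAh or Wh,
# since the voltage is unchanged).
drop = (old_wh - new_wh) / old_wh

print(f"{old_wh:.1f} Wh -> {new_wh:.1f} Wh, a {drop:.1%} drop")
```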

I’d estimate that the integrated NVIDIA chipset and ever more behind-the-scenes power-saving techniques are why Apple is claiming such solid life despite killing a fifth of the machine’s energy supply — but a 20% reduction is still no small number. It also means that as your new MBP’s battery degrades, you’ll have a smaller pool of potential energy to rely on, meaning you could wind up having to replace your battery more often (although that’s a little conjectural, at this point).

But as some are now postulating, one technique Apple may be employing to save power is using the machine's GPU(s) to accelerate video playback. We already know that the new MacBooks run a different build of OS X than older-generation machines (9F2114), but one thing I also noticed is that QuickTime, the engine behind Apple's video encodes and decodes, was also revved in the new machines: version 7.5.5 is now at build 995.23.3, up from the last machines' 990.7. With a little luck, perhaps video encodes will finally be hardware accelerated, too. (Will have to test that one later!)

Update: Matt at Gizmodo mentions that the numbers add up when comparing the extra 20% battery life you get in new models using the integrated GPU against the same five hours claimed for old models using discrete graphics. The new MacBooks' battery capacity has also decreased (as expected), from 55 Wh in last-gen models to 45 Wh.

Resource - Ryan Block

The TechCrunch Layoff Tracker

After Silicon Valley woke up to the economic crisis last week and VCs rang the alarm bells, startups are starting to heed the call and tighten their belts for a long winter. As the slide above from Sequoia Capital illustrates, belt-tightening now may be the most prudent thing a startup can do. While nobody likes layoffs, least of all the employees losing their jobs, it increases a startup’s chance for survival by reducing how much cash they burn each month.

This week alone, we’ve seen layoffs at Zillow, Pandora, Zivity, AdBrite, Hi5, Jive Software, and Redfin (which laid off 20 people). The week before was Seesmic, and before that eBay. We’re hearing rumblings of more to come.

It’s hard to keep up with it all. So we’ve created a simple Layoff Tracker to keep count. We’ll add layoff data here for tech companies big and small going forward. Hopefully, all the companies on this list will come out stronger on the other end.

If you know of any that have been overlooked, please submit a tip with the name of the company and number of layoffs. If it’s been covered, also send a link to the blog post or news article.

Resource - TechCrunch

Jobs responds to outrage over MacBook's missing FireWire

In one of his characteristically terse email replies, Apple chief executive Steve Jobs has reportedly told one Mac user that changes in video camera technology have reduced the need for FireWire on his company's 13-inch MacBooks.

The one-line response to a fan complaining over the lack of FireWire on the new entry level aluminum MacBooks is blunt but also points out that technology has changed since the company began including FireWire with Macs in 1999.

"Actually, all of the new HD camcorders of the past few years use USB 2," Jobs supposedly wrote in an email, a copy of which was posted to the popular Flickr image sharing website.

Jobs is likely not pleased about the current state of FireWire himself. Apple invented the standard in the late 80s as a hot pluggable replacement for SCSI, with a special emphasis on supporting media streaming with isochronous, real-time data transfers. The company then released the specification through a standards body to become IEEE 1394, where others including DEC, Texas Instruments, and Sony contributed to its development as well.

Upon returning to the beleaguered Apple in 1997, Jobs hoped to earn Apple some licensing royalties from the technology, which was quickly becoming an emerging standard not just to replace SCSI but also in video and music applications. Jobs' plan resulted in Intel offering to upgrade its USB standard to speeds approaching FireWire at a lower cost. The 'master to slave' USB 2.0 protocol was cheaper to implement than the 'smart peers' design of FireWire because USB required less intelligence in the controllers.

Somewhat ironically, Apple's 1998 iMac originated the push behind USB that allowed it to gain rapid adoption among consumers. USB 2.0 built upon that ubiquity to push into the peripheral territory that had been wholly owned by FireWire. In 2001, Apple's iPod began to popularize FireWire as an interface that was much faster for syncing the then relatively large MP3 files compared to existing players that used USB 1.0. However, by 2003, Apple started adding USB 2.0 support to target PC buyers, where FireWire ports were rare. By the end of 2005, Apple had removed FireWire sync from the iPod line as a cost savings measure.

While USB 2.0 ate into the casual peripheral market for consumer hard drives and web cams, FireWire retracted to support applications where USB 2.0 wasn't suitable. It retains clear advantages over USB 2.0 among higher performance hard drives, but in that market, FireWire is now competing against eSATA, which developed from ATA cabling. Historically, FireWire has been the way to import video from digital cameras, but as Jobs' purported email announced that is no longer always the case.

A glance at the product pages for Canon, Hitachi, JVC, Samsung and Sony, as well as Amazon's top camcorder list, indicates that virtually all new compact consumer HD cameras now use USB 2.0 to transfer footage directly to a computer instead of FireWire. Some camcorders also offer the option of burning directly to DVD, and a few can transfer video over a USB-to-FireWire bridge cable.

Steve Jobs Reply
A purported email reply from Apple chief executive Steve Jobs.


That reality is little comfort to those who fall outside of Apple's market for the new entry-level portables, many of whom are vocal in their opinions in Apple's support discussions as well as AppleInsider's own forums.

Support for older cameras, many of which (particularly DV tape models) depend on FireWire, is ruled out by Apple's aluminum MacBook update; so too are prosumer cameras such as Sony's HDR-FX1000, which needs the faster throughput of FireWire (called i.LINK by Sony) to deliver raw content if a card reader isn't used. To serious amateurs or professionals who prefer a smaller system, the loss of FireWire on the new entry level MacBooks is a vexing problem.

"I am a video producer and use my MacBook on site to ingest footage taken from FireWire cameras, even occasionally hooking the camera right up to the MacBook," says one Mac user with the previous generation system. "Well, it looks like there isn't a FireWire port on it anymore... how the heck am I supposed to do that? I am sure I am not the only one with this concern."

Professional musicians also use FireWire in recording equipment. Others have noted that the lack of FireWire additionally rules out Target Disk Mode for managing files or cloning systems, as USB 2.0's architecture lacks the capacity to support that feature. Apple's Migration Assistant software now alternatively supports importing files from another machine over Ethernet, from USB drives, or Time Machine backups, however.

Even so, many argue that Apple's move appears built to upsell any serious user to the MacBook Pro, which starts at $800 more than the entry level new MacBook, despite the fact that Apple continues to sell the previous-generation white MacBook, with FireWire intact, for $300 less than the new aluminum MacBooks.

There's no doubt that the removal of FireWire from the MacBook was as difficult a decision for Apple as it is a mourned loss for many Mac users. With FireWire increasingly receding into the professional space, Apple had to weigh several variables, including the cost of incorporating another port that many of its new users wouldn't even recognize into its entry-level laptop. After all, half of the buyers Apple sells to in its retail stores are new to the Mac. Being able to offer them a lower price will likely help more than trying to sell them on the concept of Target Disk Mode, which is entirely foreign to PC users.

The future of FireWire is still up in the air. Apple retained the FW800 version (running at 800Mbps, twice the speed of the original specification) on the new MacBook Pro, providing substantially faster throughput than USB 2.0. On the MacBook, FW400 doesn't offer most users enough of an advantage over USB 2.0 to warrant taking up the limited space on the port panel and on the logic board.
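For a rough sense of the gap the article describes, here is a back-of-the-envelope comparison of best-case transfer times. Note the 480 Mbps figure for USB 2.0 is the standard's high-speed signaling rate, not a number from the article, and real-world throughput is lower on every bus (USB 2.0 especially, given its protocol overhead):

```python
# Best-case (signaling-rate) time to move a file across each bus.
# Actual transfers are slower due to protocol overhead.
RATES_MBPS = {"USB 2.0": 480, "FireWire 400": 400, "FireWire 800": 800}

def nominal_seconds(size_gb: float, rate_mbps: int) -> float:
    """Time in seconds to move size_gb (decimal gigabytes) at rate_mbps."""
    bits = size_gb * 8 * 1_000_000_000
    return bits / (rate_mbps * 1_000_000)

for bus, rate in RATES_MBPS.items():
    print(f"{bus:>13}: {nominal_seconds(10, rate):6.1f} s for 10 GB")
```

Even at nominal rates, FW800 moves 10 GB in 100 seconds versus roughly 167 for USB 2.0, which is why the Pro keeps the port while the MacBook's FW400 offered little headroom over USB.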

"Many of us don't have great confidence that FireWire is here to stay on MacBook Pro, Mac Pro, or iMac, either," one forum user wrote.

With the advancement of USB 2.0 on the low end, erosion from eSATA among hard drives, and a migration away from FireWire even in its home field advantage among digital video users, Apple is probably wondering the same thing.

Update: Jobs continues to serve at times as Apple's unofficial public relations department, and AppleInsider can now vouch for the authenticity of the aforementioned email with a high degree of certainty. Since our publication of his original email Thursday, Jobs has exchanged another pair of emails with David, both of which can be seen here:

---------- Forwarded message ----------
From: Steve Jobs
Date: Thu, Oct 16, 2008 at 4:04 PM
Subject: Re: Firewire RIP?
To: Xxxxx

The new HD camcorders start around $500.

Sent from my iPhone


On Oct 16, 2008, at 12:41 PM, Xxxxx wrote:

Hi Steve,

Thanks for the fast response! In answer to your statement, though, I decided to look at the selection of camcorders on BestBuy.com since I believe they represent a pretty average staple of what consumer electronics people are buying. Although you are correct that (almost) all of the new HD camcorders use USB 2.0, there are still many, many standard definition camcorders (read: affordable for average Joes) that require firewire. Does this mean to say that Apple no longer supports average Joes from making home movies on their computers? In other words, if I have a $300 firewire camcorder and a new MacBook, shouldn't I be able to edit videos of my kid's birthday just as easily as someone who has a MacBook Pro and a $1200 HD camcorder?

Sincerely,

-David


Resource - AppleInsider

Coloratura on demand: Met Opera adds HD streaming

Opera tends to be a love it or hate it form of artistic expression; I fall strictly into the latter camp, since I tend to feel the vocals ruin what otherwise might be a fine piece of music. That should provide some useful background for the following statement: I have watched online streaming of opera in HD, and I came away impressed.

Next week, New York City's Metropolitan Opera will launch a service that allows opera buffs to stream performances to any computer with a broadband connection and enough horsepower to handle the HD.

The Met Opera has always been interested in getting its performances out to the opera-loving public that exists far beyond its New York City environs. It has broadcast live performances on PBS stations for years and, two years ago, it started offering HD recordings as well. These recordings, as well as historic, audio-only broadcasts (one dating from 1937), will be made available through the service. The Met plans on adding both archival and new recordings as time goes on. Those performances will be sold for both ad hoc and subscription viewing. A single performance will cost $4-5, while the monthly subscription fees will run in the neighborhood of $15.

"I am delighted that the Met’s incredibly extensive archive of video and audio performances will be so easily accessible to opera lovers everywhere," the Met's Music Director, James Levine, said in a statement. Opera lovers may have easy access to the streams, but that doesn't mean they'll actually be able to do anything with them. The HD video comes in a Flash wrapper, but the recommended system requirements for HD are pretty hefty: multicore processor, a gigabyte of free RAM, and 32 MB of video RAM. Most computers sold within the last two years should fit the bill, but I have no idea of what the Venn diagram of opera lovers and owners of current hardware looks like.

The Met offers the curious a preview site where users can see how well their setup will work out. In my case, additional software was needed before the stream would play, but the service quickly popped up a window that tested the new code before it launched into the preview clip. They were apparently not kidding about their minimum hardware requirements; my 2.3 GHz Core 2 Duo devoted much of its attention to keeping the clip running, and my disk has been thrashing ever since, as most of the programs I use seem to have been pushed into virtual memory. Maybe installing Flash 10 would help here.

The network was even more problematic, as the stream pretty much saturated my DSL connection, resulting in some stuttering and sporadic screen freezes the first time through. A second viewing went completely smoothly, however, suggesting that hitting pause at the start and letting the download run for a while ahead of viewing would eliminate any trouble. What that didn't eliminate was the CPU requirement, which kicked my laptop's fan into an audible spin; not the sort of thing an opera buff is likely to tolerate.

But live opera is as much about the visuals as the sound, and here, the service really impressed. Switching to full screen mode was, in a word, stunning. And remember, I don't really go for this stuff.

Some people clearly do and, with a price that's in line with various movie services, the Met Opera's video streams are likely to find some takers. How many is hard to judge, given that the audience will need a combination of initial interest, hefty hardware, and significant bandwidth.

Still, the fact that we're considering the size of the audience says a lot about the status of online video, which has shot from curiosity to cultural phenomenon in a few very short years. I will ponder that as I hunt down some decent music to listen to, like Beethoven's 7th.

Resource - Ars Technica

Ballmer: Windows 7 is Vista, just a lot better

Windows 7 will be like Windows Vista, but more so, Microsoft CEO Steve Ballmer said Thursday as he defended the first two years of Vista and claimed its successor will be a major release.


"[Windows 7], it's Windows Vista, a lot better," said Ballmer during a 45-minute question-and-answer session hosted by a pair of Gartner analysts at the research firm's annual Symposium ITxpo in Orlando, Fla. The interview was later posted as a webcast on the Gartner site.

Ballmer was responding to a question from Gartner's Neil MacDonald, who asked how Microsoft would walk the line between doing too much with Windows 7 -- thus risking the kind of compatibility problems that plagued Vista early in its career -- and too little, which might give customers an excuse to pass on the upgrade.

"Windows Vista is good, Windows 7 is Windows Vista with clean-up in user interface [and] improvements in performance," Ballmer said. "Look, I'm not encouraging anybody to wait, I'd go ahead and deploy it right away. We didn't have to go in an incompatible direction to make big strides forward."

Ballmer also took exception to the idea that Windows 7 will be a minor release or a spit polish on Vista. "It's a real release," he said, "because it's a lot more work than a minor release. It turns out you can [do] more than just a minor release in what is essentially a two-and-a-half year period of time. There's no reason to do just, quote, a minor release, in two-and-a-half years."

The major-minor release question has plagued Microsoft since shortly after Vista was released, when company executives seemed to say that it planned to update its operating system on an alternating basis, with the major updates -- what Vista was to XP, for example -- every four years, with minor updates in between. By that map, Windows 7 would be a "minor" update, since Vista was "major."

Microsoft itself has given mixed messages about the follow-up to Vista. Many observers have interpreted the fact that Microsoft has been adamant about application and device driver compatibility between Vista and Windows 7 as proof that the latter will be a minor upgrade. But top company officials have increasingly been pressing the "major" button; Ballmer is only the most recent to do so.

On Tuesday, for instance, when Mike Nash, vice president of Windows product management, said Windows 7 was the product's official name, he called the operating system "evolutionary" but still a "significant" advancement. "It is in every way a major effort in design, engineering and innovation," Nash said then.

But even as Ballmer defended Vista's first two years in the market, claiming that it has 180 million users, he seemed to understand that companies might decide to skip the OS and move straight from Windows XP to Windows 7. "If people want to wait, they certainly can," he said, answering MacDonald's question about why users simply shouldn't wait for the new-and-improved Vista, aka Windows 7.

"Look, no Windows release has to have people want to use it right away," Ballmer continued. "At least in this audience, everybody's going to test it. But the fact of the matter is, no one really ever waits." Instead, he argued, most companies constantly refresh a portion of their computer inventory each year, bringing in the newest operating system with that turnover.


Windows 7, which Microsoft has said would be out in the latter part of 2009 or early 2010, will debut as an alpha in less than two weeks, when the company hands it to attendees at its Professional Developers Conference (PDC), which opens Oct. 27 in Los Angeles.

It will be the first in what apparently will be a long line of operating systems built on the Vista code base. Today, Ballmer rejected the idea that Microsoft would need to do a "reset" of the client code in the near future. "We can do a lot of innovation for a lot of years on the same code base," he said before acknowledging that how the OS takes advantage of multi-core processors is still an open question.

"We have a lot of enhancements we can do [to the code base]," he said.

Resource - InfoWorld

Top 10 strategic technologies for 2009

Analyst house Gartner has tipped its top 10 strategic technologies and trends for next year, with virtualization, cloud computing and social networking all making the grade.

The analysts believe all 10 technology trends have the potential to have a significant impact on businesses over the next three years. David Cearley, vice president and distinguished analyst, said companies should look at them as "opportunities"--and evaluate where each can add value to their business' services and offerings.

The 10 technologies/trends are:

1. Virtualization
Not just server virtualization but storage and client devices too. For instance, Gartner says virtualization can significantly decrease the cost of holding information by eliminating duplicate copies of data on real storage devices.

2. Cloud computing
The analyst says smaller companies especially can benefit from cloud computing because of built-in elasticity and scalability which can help them grow quickly while also reducing barriers to entry. Gartner also believes there are opportunities here for larger organizations, especially as certain IT functions become less customised.

3. Servers--beyond blades
Servers are evolving in a way that will simplify the provisioning of capacity, according to the analyst, so organizations will be able to track an individual resource type--such as memory or processing power--and replace that as needed, rather than having to pay for all resources every time an upgrade is needed.

4. Web-oriented architectures
Web-centric technologies and standards will continue to affect enterprise computing models, says Gartner--leading to greater use of service-oriented environments in the business over the next five years.

5. Enterprise mash-ups
Quirky Web mash-ups are inspiring businesses to investigate how mash-ups can be added to enterprise systems to help deliver and manage applications. The analyst says application architects and IT leaders should therefore look to further explore enterprise mash-ups.

6. Specialized systems
Heterogeneous server systems are an emerging trend in high performance computing--to cope with the most demanding workloads--where previously a dedicated appliance may have been used. Gartner says it eventually expects this specialized system approach to filter down to the general-purpose computing market as well.

7. Social software and social networking
The analyst says organizations should consider adding a social dimension to a conventional Web site or application -- and should adopt a social platform sooner rather than later, or risk seeming "mute" while rivals get talking to communities and customers.

8. Unified communications
Expect massive consolidation in the comms industry as apps shift to common off-the-shelf server and operating systems, says Gartner. This means formerly distinct markets and vendors will converge--so organizations must plan to take account of comms functions being replaced or converged and adjust their own admin teams accordingly.

9. Business intelligence
BI was the top tech in Gartner's 2008 CIO survey and the analyst continues to back its potential for boosting and transforming business performance. It says such tools are particularly valuable in a difficult business environment like the current global credit crunch.

10. Green IT
Companies should think about shifting to more efficient products and processes, and cutting energy use, as environmental scrutiny increases. Green regulation is on the rise, and it has the potential to seriously limit how businesses build data centers, so organizations should have alternative plans for capacity growth.
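Gartner's first item -- storage virtualization "eliminating duplicate copies of data on real storage devices" -- rests on content-addressed deduplication: hash each block, store each distinct block once, and keep only pointers elsewhere. A minimal Python sketch of the idea (the function and block names here are ours, purely illustrative):

```python
import hashlib

def dedup_store(blocks):
    """Store each distinct block once, keyed by its content hash.

    Returns (store, layout): the unique blocks, plus the ordered list
    of hashes needed to reconstruct the original sequence.
    """
    store = {}
    layout = []
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep only the first copy
        layout.append(digest)
    return store, layout

# Three logical blocks, two of them identical: only two are stored.
blocks = [b"config-v1", b"payload", b"config-v1"]
store, layout = dedup_store(blocks)
print(len(blocks), "logical blocks,", len(store), "stored")  # 3 logical blocks, 2 stored
```

Real deduplicating storage systems do this at far larger scale, but the saving comes from exactly this mechanism: identical content collapses to a single stored copy.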


Resource - ZDNet Asia

Benchmarking Flash Player 10 (Updated)

The web collectively got a bit shinier this week with Adobe's release of Flash Player 10. The new version offers designers a compelling set of new features including support for rich 3D visual effects, a new antialiasing engine, an improved drawing API, support for color management, and enhanced support for streaming audio and video content. With this release, Adobe is clearly taking steps to ensure that Flash stays ahead of the curve and won't lose traction in the face of competitive pressure from Silverlight.

Typically, the release of an Internet plug-in—even one as widely used as Flash—is interesting mostly to the developers who write software for the platform, but users may have a lot to look forward to in this release, especially those using Linux or Mac OS X. In addition to strengthening Flash's media and graphics features, Adobe has also labored to improve performance and cross-platform support. Between updates to underlying technologies and some bottleneck sleuthing, Adobe says Flash performance has improved as much as 300 percent for "the rest of us." Ars Technica decided to investigate whether Adobe was merely blowing smoke in our PDFs.

To get some relatively useful numbers and real web experience with the new Flash player, we decided to compare how well the last Flash 9 release for Mac OS X and Linux performed against version 10 on a number of sites. The most significant of these sites is GUIMark, a benchmark that puts various web-based technologies like Flash and Silverlight through their paces, all while providing frame rate averages for the duration of the test. What follows are our results from GUIMark, our CPU records, and a discussion of the relevant changes in Flash for Mac OS X and Linux; combined, this should provide a clearer picture of whether Flash 10 actually provides significant performance improvements.
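For readers who want to replicate the CPU side of these numbers, the method amounts to polling processor load at a fixed interval while the benchmark runs in the browser, then reporting the average and the peak. A minimal Python sketch of that sampling loop (the `read_cpu` callback and the psutil suggestion are our assumptions, not Ars Technica's actual tooling):

```python
import time

def sample_cpu(read_cpu, duration_s=30, interval_s=1.0):
    """Poll a CPU-usage reader once per interval while a benchmark
    (e.g. GUIMark in a browser tab) runs; return (average, peak),
    the two figures quoted in the results below."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        time.sleep(interval_s)
        samples.append(read_cpu())
    return sum(samples) / len(samples), max(samples)

# With the third-party psutil package installed, a real reader would be:
#   import psutil
#   avg, peak = sample_cpu(lambda: psutil.cpu_percent(interval=None))
```

Values above 100 percent, as in the figures below, simply reflect per-core accounting: a process saturating two cores reads as up to 200 percent.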

Mac OS X

The Mac version shares the general improvements and enhancements Adobe introduced in Flash Player 10, like the aforementioned introduction of 3D Effects and enhancements to drawing APIs and color management. More details on the new features can be found in the release notes for the last beta release.

The most significant improvement for Mac users, however, is briefly described by Tinic Uro, a Flash engineer, on his blog. Traditionally, Flash has not been one of the Mac's strong points. It has never performed well, and a MacBook's fans are almost guaranteed to kick in when loading up just about any Flash advertisement or that game your boss keeps busting you for playing.

As we've reported before, though, Adobe's engineers have discovered a bottleneck in Flash's text rendering. The GUIMark benchmark should be very sensitive to this because it spends more than half of its time testing a single Mac OS X text function: ATSUGetUnjustifiedBounds. As you'll see from our Mac OS X results below, performance has definitely improved in many respects on Mac OS X, though some Arsians suspect that other problems may still exist with Mac OS X's NSPlugin API.

That said, let's take a look at the results of pitting Flash Player 10 against its predecessor on both a 1.6 GHz MacBook Air and a Quad 2.66 GHz Mac Pro. These results were run using the most recent versions of both players, hosted in Safari 3.1.2 on Mac OS X Leopard 10.5.5.

  • Flash 9
    • GUIMark
      • Air: 4 FPS, two separate CPU cores held steady at 103 percent
      • Mac Pro: 9.5 FPS, two separate CPU cores held collectively at 100 percent
    • Hulu video
      • Air: 98 percent CPU
      • Mac Pro: 63 percent CPU
    • YouTube video
      • Air: 74 percent CPU
      • Mac Pro: 45 percent CPU
    • 2advanced.com
      • Air: 20 percent CPU; menu interaction, popups, audio cues, etc. peaked at 86
      • Mac Pro: 25 percent CPU, peaked at 75
    • Winterbells
      • Air: 78 percent CPU
      • Mac Pro: 55 percent CPU
  • Flash 10
    • GUIMark Flex3
      • Air: 17 FPS, 108 percent CPU
      • Mac Pro: 28 FPS, 140 percent CPU
    • Hulu video
      • Air: 84 percent CPU
      • Mac Pro: 56 percent CPU
    • YouTube
      • Air: 70 percent CPU
      • Mac Pro: 40 percent CPU
    • 2advanced.com
      • Air: 20 percent CPU, peaked at 55
      • Mac Pro: 28 percent CPU, peaked at 66
    • Winterbells
      • Air: 85 percent CPU, gameplay noticeably ramped up to a more challenging speed
      • Mac Pro: 60 percent CPU, gameplay noticeably ramped up to a more challenging speed

As you can see, the GUIMark results offer the best backup for Uro's claims that performance on Mac OS X has improved as much as three times. Four frames per second in Flash Player 9 on the MacBook Air, and 9.5 FPS on the Mac Pro, is, quite frankly, pathetic. While 17 FPS on the Air in Flash Player 10 isn't exactly great, it is actually over a 4x boost in performance. 28 FPS for the Mac Pro is right at 3x faster, though the GUIMark test worked both machines noticeably harder, especially the Mac Pro. Fortunately, our CPUs were worked at least marginally less in other tests, especially web-based video, where Flash now dominates.
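The speedup factors quoted above follow directly from the raw frame rates; a quick sanity check (our arithmetic, not part of the original benchmark):

```python
def speedup(old_fps, new_fps):
    """Ratio of new frame rate to old."""
    return new_fps / old_fps

# MacBook Air, GUIMark: Flash 9 at 4 FPS vs. Flash 10 at 17 FPS
print(f"Air: {speedup(4, 17):.2f}x")        # 4.25x -- "over a 4x boost"
# Mac Pro: 9.5 FPS vs. 28 FPS
print(f"Mac Pro: {speedup(9.5, 28):.2f}x")  # 2.95x -- "right at 3x"
```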

Linux

Flash is widely reviled by Linux users, who almost universally disdain its proprietary licensing model, lousy performance, excessive resource consumption, poor platform integration, and abysmal lack of stability. Flash is listed by Mozilla's crash reporting system as the number one cause of Firefox crashes on Linux. On Ubuntu's brainstorm web site—where Ubuntu users suggest ways that the Linux distribution can be improved—over a thousand users have voted for items that highlight the need to fix Flash's dysfunctional behavior and frequent crashes.

Although it is still too early to tell if the stability problems have been addressed, the benchmarks clearly show that performance is on the rise. Adobe has also fixed several major bugs and added some Linux-related features that improve the Flash user experience. Benchmarks on Linux, which were conducted with GUIMark on my Mac Pro running Ubuntu 8.04, show modest improvements. Flash 9 clocked in at roughly 14 FPS, while Flash 10 boosted that to about 22 FPS.

Previous versions of Flash exhibited a very frustrating z-ordering bug that caused Flash content to always be drawn on the very top layer, where it could obscure dynamic HTML elements like drop-down menus. The problem was caused by a lack of proper WMODE support in Flash and in Firefox. This long-standing bug has finally been fixed, and drop-down menus are now correctly displayed on top. Users will need Firefox 3.0.2 or higher, however, for this to behave properly.

Another big improvement for Linux users is the addition of support for Flash's video camera functionality. Adobe's Mike Melanson described this feature back in July when it appeared in beta 2. Adobe collected feedback and camera information from users to help fine-tune the camera support. It is implemented with the Video4Linux v2 API, which is tightly integrated with the Linux kernel and supports a wide range of devices.

Adobe has also stepped up its packaging efforts to better accommodate Linux users. With the release of Flash 10, Adobe's web site now provides the plugin in a convenient DEB package designed specifically for Ubuntu. This is offered alongside an RPM package and a binary tarball. It's also worth noting that the new version of Flash is already packaged and available from the restricted repository in Ubuntu 8.10.

Although Adobe has addressed some of the biggest problems that have been troubling Linux users, there are still some aspects of Flash that will leave them dissatisfied. There are still no 64-bit builds, for instance, and most of the components of the Flash player plugin are still proprietary and closed. Several open source software projects, including Gnash and Swfdec, aim to eliminate this dependence on Adobe by creating completely open implementations of the Flash player. These projects are maturing steadily and could eventually displace Adobe's implementation on Linux.

Conclusion

The latest version of Flash offers tangible improvements for users and for designers. The growing emphasis on strong cross-platform support reflects the increasing relevance of alternative operating systems in the technology industry. Adobe's efforts to improve Flash on Linux and Mac OS X are a tacit acknowledgment of Apple's rising market share and Linux's modest success on netbooks. They are also a sign that Adobe is aiming to make Flash a stronger solution for cross-platform application deployment through its popular AIR runtime. This is further illustrated by Adobe's commitment to AIR on Linux and its recent decision to join the Linux Foundation.

Flash's massive install base gives Adobe a lot of power, but the company's dominance in the rich web content space hasn't made it lazy. Adobe is clearly capable of holding its own against a Silverlight incursion by delivering technical improvements that ensure Flash doesn't lose its shine.

Update - Now with more Windows benchmarks

Vista

The original focus of this piece was on the platforms where the Flash player has traditionally performed very poorly. But, by popular request, we ran the tests again on the same Quad 2.66 Mac Pro with 6GB of RAM, but this time in Firefox 3.0.2 on Windows Vista running under Boot Camp (no virtualization).

As the numbers show, Flash performs far better on Vista than on Mac OS X on the same hardware, and it actually improved slightly with the version 10 update.

  • Flash 9
    • GUIMark: 45 FPS, 54 percent CPU
    • Hulu: 9 percent CPU
    • YouTube: 8 percent CPU
    • 2advanced: 4 percent CPU, peaking at 27
    • Winterbells: 12 percent CPU, peaking at 16
  • Flash 10
    • GUIMark: 46 FPS, 54 percent CPU
    • Hulu: 7 percent CPU
    • YouTube: 6 percent CPU
    • 2advanced: 2 percent CPU, peaking at 31
    • Winterbells: 9 percent CPU, peaking at 14

Resource - Ars Technica

Google Answers the iPhone

In the exciting new category of modern hand-held computers — devices that fit in your pocket but are used more like a laptop than a traditional phone — there has so far been only one serious option. But that will all change on Oct. 22, when T-Mobile and Google bring out the G1, the first hand-held computer that’s in the same class as Apple’s iPhone.

I have been testing the G1 extensively, in multiple cities and in multiple scenarios. In general, I like it and consider it a worthy competitor to the iPhone. Both devices run on fast 3G phone networks and include Wi-Fi. Both have smart-touch interfaces and robust Web browsers. Both have the ability to easily download third-party apps, or programs.

But the two devices have different strengths and weaknesses, and are likely to attract different types of users.

If you’ve been lusting after the iPhone’s functionality, but didn’t like its virtual keyboard or its user interface or its U.S. carrier, AT&T, the G1 may be just the ticket for you. But it does have some significant downsides.

By far, the G1’s biggest differentiator is that it has a physical keyboard, which is revealed by sliding open the screen. The keyboard proved only fair in my tests, with keys that are too flat and that can be hard to see in bright light, and with a bulge in the body on the right side that you have to reach over to type. But, for the many people who can’t stomach typing on glass, the G1 keyboard will be a welcome sight. It’s complemented by a BlackBerry-like trackball for navigation.


The G1 has a smart-touch screen like its iPhone rival, for Web browsing and downloading programs. But it has a physical keyboard for conventional typing.

The G1 has a removable battery and uses removable, expandable memory cards. And it’s even a bit cheaper than its Apple (AAPL) rival: $179 versus $199. Its data plan also costs less — $25 a month versus $30 — and includes 400 free text messages, which cost extra on the iPhone. There’s also a $35 plan that includes unlimited text messages. And both plans include free use of T-Mobile’s Wi-Fi hotspots.

The G1 has a slick, clever touch interface to go along with its keyboard, and it includes a powerful new operating system. The operating system, called Android, was built by Google (GOOG). It is slated to appear on other phones over time, though it likely will look different on other devices because it is fully open to modification by other companies.

On the G1, the touch interface is fast and smooth. Programs appear when you drag up a tab at the bottom of the screen, and notifications of new messages can be read by simply dragging down the top bar of the screen.

You get much more flexibility in organizing your desktop than on the iPhone. In addition to placing icons for programs there, you can add individual contacts, music playlists, folders, Web pages and more. You just press on the screen for a longer-than-usual time, and a list of items you can add appears. It also has a higher-resolution camera than the iPhone, but like the Apple phone, it can’t shoot video.

It’s also much easier to place a phone call on the G1 than on the iPhone. You can just start typing a contact name or phone number while on the home screen, sparing you the need to enter the phone or contacts program. And there’s a virtual phone keypad that allows you to avoid opening the physical keyboard just to dial a number. It’s also much easier to jump to the top and bottom of long lists.

The G1’s Web browser, built on the same technology as the iPhone’s, worked well at rendering scores of common sites in my tests. You can either pan around pages with your finger, or choose to view the whole page at once and zero in on a section by moving a small rectangle around.

This first Android phone, which was largely designed by Google and built by Taiwan-based HTC, also includes some key features Apple omitted. These include a limited ability to copy and paste text, and the ability to send photos directly to other phones without relying on email, a common phone feature called MMS, or Multimedia Messaging Service. And, unlike AT&T (T), T-Mobile (DT) will even allow users to legally unlock the phone after 90 days and start using it on another carrier, provided you pay a hefty early-termination fee.

In my battery tests, the G1 lasted through the day, but I had to charge it every night. That’s better than the iPhone’s battery life at launch, though in fairness, Apple has since improved it through software updates, and I found the two phones about the same for mixed use.

In my talk-time test, the G1 got just under its claimed five hours, about 19 minutes better than the iPhone.

There are two email programs: one for Google’s Gmail, another for all other email services. There’s an instant-messaging program that works with multiple services. There’s one program for accessing Google’s YouTube service and another for Google Maps. The G1’s Google Maps program even has a feature, coming soon as well to the iPhone, that offers photographic street views of certain locations. But the G1, unlike the iPhone, includes a compass that orients the street views as you walk.

The built-in download store for third-party programs, called Market, worked well in my tests. I was able to quickly download games, productivity programs, and other apps and, unlike Apple, Google says it isn’t blocking any programs.

However, the G1 also has downsides. It’s a chunky brick of a device. While it’s a bit narrower than the iPhone and feels OK in the hand, it’s almost 20% heavier and nearly 30% thicker. It also has a smaller screen and doesn’t accept standard stereo headphones.

The G1 also skimps on memory. It comes with only 1 gigabyte of storage, just one-eighth of what the base iPhone offers. To increase the G1’s memory, you have to lay out more money to buy a larger memory card.

The G1 also limits third-party applications to a paltry 128 megabytes of memory. At one point in my tests, after downloading a bunch of third-party programs, and adding songs and videos, the G1 warned me it was running out of room, a warning I have never seen on my heavily used iPhone.

Another downside for some users: The G1 is tightly tied to Google’s online services. While you can use non-Google email and IM services, the only way you can get contacts and calendar items into the phone is to synchronize with Google’s online calendar and contacts services. In fact, you can’t even use the G1 without a Google user ID and password.

The G1 doesn’t allow the use of Microsoft’s Exchange service for email, contacts or calendar items, or any other company’s over-the-air synchronization for contacts and appointments.

In my tests, synchronizing with Gmail, and with Google’s contacts and calendar applications, was smooth and fast. So, the G1 may be great for dedicated Google users, but not so good for folks who rely on competing calendar and contacts services from, say, Yahoo (YHOO) or Microsoft (MSFT). Future Android phones may not be so tightly tied to Google services, but the G1 is.

It also can’t synchronize any data at all directly with a PC or Mac. For instance, it can’t sync with Microsoft Outlook or Windows Media Player on a PC, with Apple’s iCal or Address Book programs on a Mac, or with iTunes on either Windows or the Mac. It has no PC-based synchronization software of its own, and it offers no way to automatically back up your settings, music, applications, videos or photos, either to a computer or to an online repository, though Google says it plans to add a backup feature.

To get Outlook or iCal data onto the G1, you must install add-on software. To get your songs, videos and photos onto the G1, you must plug the phone, or its memory card, into your computer and manually move the files over.

Overall, I found the G1’s user interface inferior to the iPhone’s. It lacks the iPhone’s ability to flick between multiple pictures and Web pages, or to zoom in and zoom out of a photo or Web page by simply using two fingers to “pinch” or expand the image. It also doesn’t automatically change the orientation of the screen from portrait to landscape simply by turning the phone.

Further, many common controls that are easily visible on the iPhone can be accessed on the G1 only by pressing a menu button or by using keyboard shortcuts you have to memorize. Examples are stopping the loading of a Web page or moving forward to the next Web page.

There’s also no on-screen keyboard even for quick tasks, such as typing Web addresses, so you’re constantly having to turn the phone and open the physical keyboard, which quickly becomes a pain.

The G1 also is a greatly inferior multimedia device when compared with the iPhone. Its music player, while adequate, isn’t as nice as the built-in iPod on the iPhone. And it lacks a video player altogether, though a rudimentary one can be downloaded from the Market. The G1 does come with a program for buying songs from Amazon (AMZN), which worked well in my tests.

And then there’s the network. Despite all the troubles AT&T has experienced with its fast 3G network, which is still being built out, that company has 3G service for the iPhone and other devices in 320 U.S. metro areas. By contrast, T-Mobile offers 3G in just 20 U.S. metro areas. Eight more cities are due to come online by year end, which will still leave T-Mobile’s 3G coverage far behind that of AT&T and Verizon (VZ), which will soon introduce its own iPhone competitor, the BlackBerry Storm.

I did 40 speed tests comparing the G1 and the iPhone to see how fast they could download a Web page over 3G. The tests, conducted in Scottsdale, Ariz., and Washington, D.C., showed the iPhone to be consistently faster, by an average of between 50 and 100 kilobytes per second, even though T-Mobile’s network was carrying much less traffic than AT&T’s.
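This kind of throughput comparison boils down to timing repeated fetches of the same page and converting bytes and seconds into a rate. A rough Python sketch (the five-trial default and the use of `urllib` are our choices; this is not the reviewer's actual methodology):

```python
import time
import urllib.request

def throughput_kb_s(num_bytes, seconds):
    """Transfer rate in kilobytes per second."""
    return num_bytes / 1024 / seconds

def mean_page_rate(url, trials=5):
    """Time `trials` downloads of the same page; return the mean KB/s.

    Repeating the fetch and averaging smooths out momentary network
    variation, which matters when the gap being measured is only
    50-100 KB/s.
    """
    rates = []
    for _ in range(trials):
        start = time.time()
        data = urllib.request.urlopen(url, timeout=30).read()
        rates.append(throughput_kb_s(len(data), time.time() - start))
    return sum(rates) / len(rates)

# Example (requires network access): mean_page_rate("http://example.com")
```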

Overall, the G1 is a very good first effort, and a godsend for people who prefer physical keyboards or T-Mobile but want to be part of the new world of powerful pocket computers.


Resource - AllThingsD

Google outs remote kill switch in Android, those rascals

Remember the outrage at Apple's inclusion of a sneaky application kill switch in the iPhone 3G? Yeah, well, Google's got one too. This time, however, it wasn't discovered by some meddling developer; Google owns up to it right inside the Android Market terms of service:
"Google may discover a product that violates the developer distribution agreement ... in such an instance, Google retains the right to remotely remove those applications from your device at its sole discretion"
Google then claims that it will make "reasonable efforts to recover the purchase price of the product ... from the original developer on your behalf." This comes on top of the Android Market's policy which allows you to "return" (er, how, it's electronic?) any application within 24 hours for a full refund. Aw shucks Google, come over here and give us a hug.

Resource - Engadget

Worldwide mobile cellular subscribers to reach 4 billion mark late 2008

Geneva, 25 September 2008 — ITU Secretary-General Hamadoun Touré announced in New York that worldwide mobile cellular subscribers are likely to reach the 4 billion mark before the end of this year. Dr Touré was speaking at the high-level events on the Millennium Development Goals (MDGs) in New York, where he also participated in UN Private Sector Forums addressing the global food crisis and the role of technological innovation in meeting the MDGs.

The MDGs were adopted following the United Nations Millennium Declaration by UN Member states in 2000, representing an international commitment to eradicate extreme poverty and hunger, achieve universal primary education, promote gender equality, reduce child mortality, improve maternal health, combat epidemics such as HIV/AIDS and malaria, ensure environmental sustainability, and develop a global partnership for development that would include making available the benefits of information and communication technologies. ICTs have been recognized as an important tool to achieve the MDGs.

Since the turn of the century, the growth of mobile cellular subscribers has been impressive, with year-on-year growth averaging 24 per cent between 2000 and 2008. While in 2000, mobile penetration stood at only 12 per cent, it surpassed the 50 per cent mark by early 2008. It is estimated to reach about 61 per cent by the end of 2008.

"The fact that 4 billion subscribers have been registered worldwide indicates that it is technically feasible to connect the world to the benefits of ICT and that it is a viable business opportunity," said Dr Touré. "Clearly, ICTs have the potential to act as catalysts to achieve the 2015 targets of the MDGs."

While the data shows impressive growth, ITU stresses that the figures need to be carefully interpreted. Although in theory a 61 per cent penetration rate suggests that at least every second person could be using a mobile phone, this is not necessarily the case. In fact, the statistics reflect the number of subscriptions, not persons.
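The subscriptions-versus-people distinction is easy to see with rough numbers. A small Python illustration (the roughly 6.7 billion world-population figure and the one-in-ten second-SIM share are our assumptions for the example; the 4 billion subscriptions figure is ITU's):

```python
def penetration_pct(subscriptions_bn, population_bn):
    """Subscriptions per 100 inhabitants -- the rate ITU reports."""
    return 100 * subscriptions_bn / population_bn

# 4 billion subscriptions over ~6.7 billion people lands near the
# 61 percent penetration ITU projects for end-2008...
print(round(penetration_pct(4.0, 6.7)))        # ≈ 60

# ...but if a tenth of those were second subscriptions, the share of
# people actually holding a phone would be noticeably lower:
print(round(penetration_pct(4.0 * 0.9, 6.7)))  # ≈ 54
```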

Double counting takes place when people have multiple subscriptions. Also, operators’ methods for counting active prepaid subscribers vary and often inflate the actual number of people that use a mobile phone.

On the other hand, some subscribers, particularly in developing countries, share their mobile phone with others. Grameen Phone in rural Bangladesh, for instance, is often cited as a success story built on this kind of sharing.

ITU further highlights that despite high growth rates in the mobile sector, major differences in mobile penetration rates remain between regions and within countries.

The impressive growth in the number of mobile cellular subscribers is mainly due to developments in some of the world’s largest markets. The BRIC economies of Brazil, Russia, India and China are expected to have an increasingly important impact in terms of population, resources and global GDP share. These economies alone are expected to account for over 1.3 billion mobile subscribers by the end of 2008.

China surpassed the 600 million mark by mid-2008, representing by far the world’s largest mobile market. India had some 296 million mobile subscribers by end July 2008 but with a relatively low penetration rate of about 20 per cent, India offers great potential for growth. Market liberalization has played a key role in spreading mobile telephony by driving competition and bringing down prices. India’s mobile operators increasingly compete for low-income customers and Average-Revenue-Per-User in India has reached around USD 7, one of the lowest in the world.

ITU recently published two regional reports for Africa and Asia, which indicate how mobile telephony is changing peoples’ lives. Apart from providing communication services to previously unconnected areas, mobile applications have opened the doors to innovations such as m-commerce to access pricing information for rural farmers and the use of mobile phones to pay for goods and services. While mobile broadband subscribers remain concentrated in the developed world, a number of developing countries, including Indonesia, the Maldives, the Philippines and Sri Lanka in Asia-Pacific have launched 3G networks.

Broadband uptake enables a range of socially desirable and valuable online services, specifically targeting the MDGs in areas such as e-government, e-education and e-health. The use of broadband technologies can help overcome many of the basic development challenges faced by developing countries.


Resource - ITU News

Google Tests the New iGoogle



Announced in April, the new version of iGoogle that brings social applications is being tested on a small number of randomly selected Google accounts.

The new iGoogle places the tabs on the left-hand side of the page and you can expand the tabs to see the list of gadgets and status information, like the number of unread Gmail messages. There's a new chat feature borrowed from Gmail that lets you chat with your contacts while visiting iGoogle - that means iGoogle gets a sense of presence because you'll know when your contacts are online. Since the chat feature will be enabled by default, it's obvious that Google will be able to add options for sharing items and discussing posts with the contacts that are online.

iGoogle also adds a list of updates from your contacts similar to Facebook's newsfeed: you can see stories shared by your contacts in Google Reader, recent photos uploaded to Picasa Web Albums, Google Talk status messages, shared iGoogle themes and gadgets.


Another change is that gadgets have an expanded interface, called canvas view. Gadget authors will take advantage of this to display more information and make their gadgets more interactive, while your feeds can be read in a Google Reader-like interface. In the future, iGoogle will support OpenSocial applications and the transformation into a social site will be complete.

Google announced that the canvas view will be rolled out to a small percentage of users this month and to more users in July, while the OpenSocial applications "will not work in production until later this summer".

Update (Oct. 16): The new iGoogle has been launched.

Resource - GoogleSystems’ Blog

Dynamic programming futures

What will the world of dynamic programming languages and Web applications look like in five years? This is one of those highly personal and deeply philosophical questions best saved for after dessert is served, the drinks are poured, and the sidearms are safely locked away.

At the simplest level, the debate seems crucial. Choose the right language and new libraries magically appear because, well, the coolest programmers use the right language. The hottest languages attract the most developer energy, which usually turns into new libraries with the latest ideas.

Choosing the wrong language means filling your brains with semantic cruft that must be paged out to make room for yet another way of writing a loop. No one will be able to make sense of your code, and no one but you will care.

Most programmers who've been around long enough to survive the rise and fall of programming languages such as Cobol and Fortran recognize that the problem isn't a life-or-death matter. There won't be one winner, and backing the wrong horse won't be fatal. These stable old hands point out that Cobol continues to run strong. At this writing, more than 1 percent of the listings on Dice.com include Cobol. By comparison, JavaScript draws a bit more than 7 percent!

Still, choosing poorly saps one's energy. Some languages will be the dominant choice in certain niches. Choosing poorly means duplicating effort and looking longingly at the fast progress of others.

Commons or craft
Rob Malda, one of the founders of Slashdot, says that he chose Perl for the site because there were so many good libraries available in the CPAN (Comprehensive Perl Archive Network).

"I think Perl's primary advantage in 1997 when I originally selected it was the active development occurring on CPAN," Malda explained. "There was a library for everything useful, and usually very quickly. This was critical because new technologies and versions for core functions were updating constantly."

But today, he added, "We have a much better idea of what you need for Web site building, and the tools and libraries have stabilized. All languages can handle the obvious things nicely enough now."

This is a nice, politically neutral statement, but it doesn't solve the problem that in many shops, there must be only one Highlander. Only a kindergarten teacher would smile and say that all are equally good.

When a decision must be made, some believe it makes sense to go with popularity. The rich will get richer. PHP is the first language that many people learn after mastering HTML, and it will always be as comfortable as a childhood home. PHP server platforms from Zend Technologies offer better performance, making it possible to write a serious application in the language.

But will PHP be able to shake the casual structure that encourages beginners to whip up spaghetti code? Will it be able to continue to mix the presentation layer and the application layer without driving everyone insane? Will Zend's collection of server optimizations provide enough performance to overcome any limitations of the language?

Some want to place their bets on Ruby on Rails, a striking and elegant solution that produces sophisticated results in no time. A few lines of code produce a full interface with all of the pages necessary to create, update, and delete records.

This simplicity often turns into shackles when the programmers reach the edge of the framework's capabilities. Changing little details or producing slightly unorthodox output can be maddening.
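The scaffolding idea Rails popularized can be sketched in any dynamic language: describe a model once and derive the create/read/update/delete operations from it. Below is a toy Python sketch of that pattern; every name in it is invented for the illustration, and a real framework would also generate the pages and database plumbing.

```python
def scaffold(store):
    """Generate CRUD handlers over an in-memory store, in the spirit of
    Rails scaffolding deriving them from a model definition. Toy example."""
    next_id = iter(range(1, 10**9))   # simple auto-incrementing ids

    def create(**fields):
        rid = next(next_id)
        store[rid] = fields
        return rid

    return {
        "create": create,
        "read":   lambda rid: store[rid],
        "update": lambda rid, **fields: store[rid].update(fields),
        "delete": lambda rid: store.pop(rid),
    }

posts = {}
crud = scaffold(posts)
pid = crud["create"](title="Hello", body="First post")
crud["update"](pid, title="Hello, world")
print(crud["read"](pid)["title"])  # Hello, world
```

The shackles show up the same way they do in Rails: as soon as you need behavior the generated handlers don't anticipate, you're writing everything by hand again.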

There are many other options. Some developers love Groovy, the dynamic language integrated with the Java API. A programmer gets the rock-solid foundation of compiled Java code mixed with the flexibility to diddle with the Java objects in real time.

And then there are others who see languages such as JavaScript rising from the browser and colonizing the server. A unified platform makes everything simpler. Yes, Netscape wanted this to happen years ago, but thanks to the lightning performance of the new JavaScript semi-compilers, the language is bound to look even more attractive.

All of the languages mentioned above have enough critical mass behind them to succeed and even flourish in the future. The right answer for you will depend more on the nature of your business and the structure of your data than on whether one platform becomes cooler than another.

Evolutionary forces
Toward that end, here are 10 principles that will guide the evolution of scripting languages in the future. None of these will offer the definitive answer and save you from a long evening of dessert, liquid refreshment, and debate, but they will provide some guidance that may make the answer appear with more clarity.

1. The semantic barriers won't be as important as the languages rush to steal good ideas from one another. The dynamic languages are blurring together faster than they're distinguishing themselves.

Larry Wall nabbed Python's object system for Perl, and he and his acolytes are committed to making sure that there are many ways to do anything you want to do in Perl. Language committees are always debating how to weld a great idea from another language into the current one, and this will continue to happen. In five years, there's a good chance you'll be able to imagine you're writing Python while the code is interpreted by something called JavaScript.
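The blurring is easy to see in small examples. Here is a minimal sketch: a Python closure built from first-class functions, with the near-identical JavaScript transliteration shown in a comment. The function names are invented for the example.

```python
def make_counter(start=0):
    """Return a closure that increments and returns a counter."""
    count = [start]            # mutable cell, like a captured `let` in JS
    def step():
        count[0] += 1
        return count[0]
    return step

# The JavaScript version is nearly a word-for-word transliteration:
#   function makeCounter(start = 0) {
#       let count = start;
#       return function step() { return ++count; };
#   }

tick = make_counter()
print(tick(), tick(), tick())  # 1 2 3
```

Squint at either version and it reads the same: the ideas (closures, first-class functions, late binding) have crossed the language borders even where the syntax hasn't.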

3. Applications are becoming their own worlds. There are 23 job listings for WordPress developers. While the WordPress plug-ins will be written in PHP, the programmers will rely heavily on the standard set of libraries included in WordPress. Is it fair to say that the coders are working in PHP, or are they really working in WordPress?

The power of the dominant applications is apparent to everyone. Facebook even calls its scripting language FBJS (Facebook JavaScript) because it's so site-specific.

But there are limits to this cross-pollination. "I don't see this lasting because it's so specific," said David Goodger, a director of Python Software Foundation. "A lot of graphics packages had their own proprietary language for scripting. But then it's this static thing. You don't have the advantage of this vibrant community. If you take this language like Python, you have the advantage of this well-developed tool with the well-developed libraries. You've got the best of all possible worlds."

Still, even if the applications embrace a 100 percent pure version of a language, all of the code will be dominated by the application's API. Look for languages and their syntax to remain relatively pure while the libraries define another language built on top of the first.

4. Communities will be more important. As Goodger notes, Python is especially popular in a few niches, such as the world of bioinformatics and graphics. People who work with synthetic images or DNA results learn Python to do their job. Even if Python dies everywhere else, biochemists are probably still going to be learning Python.
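Part of why the niche sticks with the language is that a typical bioinformatics one-off stays readable in a handful of lines. A small illustrative example (the sequence is made up): computing the GC content of a DNA string.

```python
def gc_content(seq):
    """Fraction of bases in a DNA sequence that are G or C."""
    seq = seq.upper()
    gc = sum(1 for base in seq if base in "GC")
    return gc / len(seq)

print(gc_content("ATGCGCTA"))  # 0.5
```

Scripts like this, passed around a lab, are how a language takes root in a community.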

The power of these communities is phenomenal. When Steve Jobs introduced the iPhone, everyone began looking for Cocoa programmers again. Mike Hendrickson, the publisher at O'Reilly Books, said, "We've seen a huge turnaround for Cocoa. It was all but gone a couple of years ago. Now, there's a huge, huge increase in Cocoa because a lot of people want to develop their cool apps for the iPhone."

If Steve Jobs decides that some unary lambda calculus is the language of choice for the iPhone 4.0, the developer community is going to find a way to rationalize his selection and talk about how much they love the language.


5. The Web and the cloud are the ultimate platform. Google's App Engine sparked a huge burst of interest in Python. Perl and PHP were early favorites because they were so well integrated with Apache, a Web server that was both free and easy to configure. Tomorrow's scripting languages of choice will be determined more by the simplicity, cost, and scalability of the hosting platform than by the purity of the syntactic sugar. Look for tools such as AppJet and Coghead to compete by selling a cloud with a simple scripting language for building the application.

6. Better language technology will make a difference. The battle for supremacy between Mozilla's Firefox ("JavaScript, I am your father") and Google's Chrome ("Come live in thread harmony, Luke") is good for everyone. The performance gains these browsers have brought to JavaScript have been dramatic, and they're already making some other scripting languages jealous.

At the end of 2007, Larry Wall wrote, perhaps puckishly, that JavaScript "has some issues, but in the long run JavaScript might actually turn out to be a decent platform for running Perl 6 on."

Sophisticated engines such as SpiderMonkey and V8 show that scripting languages can begin to compete with full compiled code because a smart just-in-time compiler can make guesses about the data that are often good enough.
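Those "guesses about the data" can be sketched in a few lines. Below is a toy inline cache in Python that remembers the operand type it last saw and takes a pre-selected path while the guess keeps holding, which is roughly the bet an engine makes before deoptimizing. This is purely a pedagogical sketch, not how SpiderMonkey or V8 is implemented.

```python
def make_specializing_add():
    """Toy inline cache: remember the operand type last seen and
    reuse a pre-selected path while the guess keeps holding."""
    cache = {"type": None, "fast": None}

    def generic_add(a, b):        # stands in for the slow, generic dispatch
        return a + b

    def add(a, b):
        t = type(a)
        if cache["type"] is t and type(b) is t:
            return cache["fast"](a, b)        # cache hit: specialized path
        # cache miss: "deoptimize" and specialize for the new type
        cache["type"], cache["fast"] = t, generic_add
        return generic_add(a, b)

    return add

add = make_specializing_add()
print(add(2, 3))        # miss: specializes for int, prints 5
print(add(4, 5))        # hit: fast path, prints 9
print(add("a", "b"))    # type changed: respecializes, prints ab
```

A real JIT compiles genuinely faster machine code for the specialized path; the point here is only the guess-check-deoptimize shape.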

The stunning performance is bound to attract the attention of folks who dream of running JavaScript on the server. While Netscape tried this idea a long time ago, there's some merit in letting both the server and the client speak the same language. Now the only problem is figuring out which version of JavaScript to use. If history is any indication, it will be just a bit different from all of the browsers.

7. Emulation and cross-compilation will extend the life of dynamic code. Java programmers can use Jython to let Python code control Java objects. Groovy burrows deeply into the Java stack. Google's Web Toolkit converts Java into JavaScript. Watch for the virtual machines from Java and .Net to become even friendlier to changes that come along at runtime.
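The "changes that come along at runtime" are everyday fare in dynamic languages. A minimal Python sketch of the kind of mutation static VMs have historically made difficult: patching a class after its instances already exist (the class and method names are invented for the example).

```python
class Greeter:
    def greet(self):
        return "hello"

g = Greeter()
print(g.greet())        # hello

# Patch the class at runtime; existing instances pick up the change
# immediately because method lookup happens dynamically.
Greeter.greet = lambda self: "bonjour"
print(g.greet())        # bonjour
```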

8. All of the embedding makes it simpler for programming to escape the command line and start appearing in Web applications themselves. Some of the highly customizable platforms, such as WordPress and some Drupal plug-ins, let you add custom code in a Web form.

Uploading JavaScript or Python on the fly to customize a Web application is still only for real programmers, but it will become easier and easier for casual users to avoid bugging the IT staff by writing their own code. Some WordPress plug-ins let users edit the JavaScript that controls the ads. The bloggers may be changing only a few colors and details for Google AdSense, but these Turing-complete mini-sandboxes are going to bring programming to the masses (see "Application builders in the sky").
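The mini-sandbox idea can be sketched with Python's `exec` and a restricted namespace. To be clear, this is an illustration of the concept only, not a safe sandbox: emptying `__builtins__` this way is easily escaped, and real embedding platforms do far more. The variable names are invented for the example.

```python
def run_user_snippet(code, allowed):
    """Run user-supplied code with only an explicit whitelist of names
    visible. Illustrative only -- this is NOT a security boundary."""
    env = {"__builtins__": {}}
    env.update(allowed)
    exec(code, env)
    return env

# A "user" customizes ad colors, as a blogger might in a plug-in form:
result = run_user_snippet(
    "ad_color = 'blue' if slot == 'header' else 'gray'",
    {"slot": "header"},
)
print(result["ad_color"])  # blue
```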

Watch clouds like AppJet, a Web site that lets you build a Web application with one file filled with JavaScript. AppJet's Web site is the IDE: You just go to a Web page and edit the code, and voila, the code is tested right in your browser.
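The one-file idea translates to any dynamic language. AppJet itself used server-side JavaScript; as a hedged analogue, here is the same shape in Python: a complete web application as a single WSGI callable, exercised directly without starting a server.

```python
def app(environ, start_response):
    """A complete web application in one WSGI callable."""
    path = environ.get("PATH_INFO", "/")
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [f"You asked for {path}".encode("utf-8")]

# Exercise it directly -- no server required:
captured = {}
body = b"".join(app({"PATH_INFO": "/hello"},
                    lambda status, headers: captured.update(status=status)))
print(captured["status"], body.decode())  # 200 OK You asked for /hello
```

Any WSGI server can host the callable unchanged, which is the whole appeal: the file is the application.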

9. The rise of the amateurs may make much of dynamic programming irrelevant. Web sites such as Coghead (see my review), Caspio, and Microsoft's Popfly let the world do much of the programming without typing any characters at all -- unless they want to put a label on some Web form. All of the instructions for the server are communicated by mouse clicks, lines, and flowcharts. This democratization will create graphical languages that may flourish -- if the creators can make them simple enough for the average human.

10. Adaptability for modern architectures is key. David Goodger says that the Python team invests a great deal of time in improving multicore performance. Earlier versions of Python could handle threads, but threads were still bound to a single core. That changed after researchers with big data sets pushed for better performance that could take advantage of the hardware.

If your applications are naturally multithreaded, then watch the development of core-savvy languages such as Python and Groovy. If the work you do is limited to a single thread, well, look elsewhere for performance.
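The single-core limit described above stems from Python's global interpreter lock: CPU-bound threads interleave on one core, and the usual escape hatch is processes, which sidestep the lock. A minimal sketch with the standard library (the workload is arbitrary):

```python
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def burn(n):
    """An arbitrary CPU-bound workload."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunks = [200_000] * 4
    # Threads share one interpreter lock, so CPU-bound work serializes...
    with ThreadPoolExecutor() as pool:
        threaded = list(pool.map(burn, chunks))
    # ...while separate processes can use every core.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(burn, chunks))
    assert threaded == parallel
```

Timing the two blocks on a multicore machine is the quickest way to see the difference for yourself.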

The one Highlander
These principles don't lead to one clear answer for the path of dynamic languages and Web development. The real answer may be that anyone can choose any of the languages as long as they make sure they track and navigate these 10 themes.

For instance, simplicity is an important theme as developers move toward elegant solutions. Ruby on Rails is quite popular because of the straightforward syntax and the tight integration with the database. The best frameworks that speed the development of complex, database-driven applications will triumph. But then, we already knew that.

Many other dynamic languages are already borrowing some of the best concepts from Rails. The Java programmers, for instance, can turn to Grails, a simple framework built on top of Groovy and a JVM.

Speed will always matter. For this reason, JavaScript will become more and more useful as the high-powered competition on the Web influences other uses of the language. Other languages will need to either borrow many of the ideas from the JavaScript core or find a way to benefit from them through emulation.

Slashdot co-founder Rob Malda, who chose to build the site on Perl because of all the good libraries in the CPAN repository, sees the features that attracted him to Perl in nearly every dynamic language today.

"Down the road it seems unlikely that we'd rewrite in Perl, but I have no real guess as to what we would rewrite in," he said. "I suspect Rails would be fast enough in five years to consider it, but who knows?"


Resource - InfoWorld