
Wednesday, December 26, 2007

Digital Camera: Digital SLR market booms


Canon recently announced that it had reached a manufacturing milestone, having now produced more than 30 million of its popular EOS digital and film SLRs.
In 1987, on the company's 50th anniversary, Canon unveiled the EOS line with the 650 AF film SLR. The first digital EOS camera, the D30, made its debut in 2000.

In digital photography, 2007 was a strong year for higher-end digital SLRs.
Already, single-lens reflex cameras were disproportionately popular as photographers moved to models that responded quickly and worked better in dim conditions. The bulk and expense were worth it.
But a panoply of new models arrived to satisfy the needs of experts and professionals in 2007. First was Canon's $5,000 EOS-1D Mark III, a rugged 10.1-megapixel photojournalist model unveiled in March that can shoot 10.5 frames per second. Alas for Canon, the camera's record was blighted by concerns about its autofocus performance.
But the floodgates opened in the second half of the year with Canon's top-end, $8,000, 21.1-megapixel 1Ds Mark III. Canon hopes this full-frame model will not only keep professional SLR shooters loyal but also woo studio photographers who use even more expensive medium-format cameras. Announced at the same time in August and aimed at the serious enthusiast was the 40D, a $1,300 10.1-megapixel model.
A week later, Canon's biggest rival, Nikon, shot back with the $1,800 D300, and, more significant by far, the $5,000 D3, the first digital SLR to follow Canon's lead with sensors as large as a full frame of 35mm film. Large sensors are expensive, but the extra real estate means that individual pixels can be made larger for a given resolution, and larger pixels can work better in low light. The ISO sensitivity rating of Nikon's D3 goes up to a whopping 25,600.
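As a rough illustration of that trade-off, the back-of-envelope sketch below compares pixel sizes at a fixed resolution. This is our own arithmetic, not a manufacturer's figure; the sensor dimensions and 12.1-megapixel count are approximate values for a full-frame camera like the D3 and a typical smaller APS-C sensor.

import math

MEGAPIXELS = 12.1e6  # roughly the D3's stated resolution

# Approximate sensor dimensions in millimeters (illustrative values)
sensors = {
    "full frame (36.0 x 23.9 mm)": (36.0, 23.9),
    "APS-C (23.6 x 15.8 mm)": (23.6, 15.8),
}

for name, (w_mm, h_mm) in sensors.items():
    # area per pixel in square micrometers (1 mm^2 = 1e6 um^2)
    pixel_area_um2 = (w_mm * h_mm * 1e6) / MEGAPIXELS
    pitch_um = math.sqrt(pixel_area_um2)
    print(f"{name}: ~{pitch_um:.1f} um pitch, ~{pixel_area_um2:.0f} um^2 per pixel")

# At the same resolution, each full-frame pixel gathers roughly 2.3 times
# the light of its APS-C counterpart -- the source of the low-light edge.
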
Olympus, too, released a new top-end model, the $1,700 E-3, and two SLR newcomers expanded their ambitions with their second models: Panasonic's $1,300 (including a lens) 10.1-megapixel DMC-L10 and Sony's $1,400, 12-megapixel Alpha A700.
Makers of compact cameras had a harder time coming up with breakthrough models. Features such as face detection and image stabilization, which most agree genuinely help improve photos, spread from the high end to the mainstream, but those gains were offset by the silliness of the unending megapixel race.
Higher-end compact cameras jumped up to 12 megapixels this year, which helps folks who like to crop images but hurts the vastly larger number who want to get something other than multicolored noise speckles when shooting in anything less than broad daylight.
In software, Adobe Systems delivered the biggest changes. For those using the higher-quality "raw" images that good cameras supply, Adobe released Photoshop Lightroom in March, and within a few months it surpassed Apple's earlier rival, Aperture, in popularity. Adobe announced an even more dramatic departure in February by declaring that it would make an online version of Photoshop. Photoshop Express is due in 2008.
Microsoft, meanwhile, made gains with its HD Photo format, built into Windows Vista and designed to replace JPEG with better compression, color, and dynamic range. In November Microsoft said the Joint Photographic Experts Group, which oversees the JPEG standards, would turn HD Photo into a new standard called JPEG XR.

Thursday, October 18, 2007

All the Energy We Could Ever Need? Space-Based Solar Power Looking Better



Published by the Pentagon's National Security Space Office, the report says the U.S. should demonstrate the technology within the next decade by building a pilot "space-based solar power" station big enough to continuously beam up to 10 megawatts of power to the ground.


The good news? Beaming all the solar energy we could ever need down to Earth from space appears more feasible than ever before. The bad news? It's going to take a lot of money and political will to get there.


While the idea of sending giant solar panels into orbit around the Earth is nothing new - it has been kicked around with varying degrees of seriousness since the '60s and '70s - changing times have made the concept a lot more feasible today, according to a study released Oct. 10 by the National Security Space Office (NSSO). Fossil fuels are a lot more expensive, and getting harder to access, than they were in past decades. And technology advances are making possible today projects that were all but inconceivable in years past.


"The magnitude of the looming energy and environmental problems is significant enough to warrant consideration of all options, to include revisiting a concept called Space-Based Solar Power (SBSP) first invented in the United States almost 40 years ago," the report's executive summary states.


Oil prices have jumped from $15 a barrel to $80 a barrel in less than a decade. In addition to the emergence of global concerns over climate change, American and allied energy security is now threatened by actors that seek to destabilize or control global energy markets, as well as by increased competition for energy from emerging global economies.


By collecting solar energy before it passes through the Earth's atmosphere and loses much of its power, a space-based solar power system could provide the planet with all the energy it needs and then some, the NSSO report said. The output over one year of a single one-kilometer-wide band of solar panels at geosynchronous orbit would roughly equal the energy in all the world's remaining recoverable oil: an estimated 1.28 trillion barrels.
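That comparison is easy to sanity-check with back-of-envelope arithmetic. The sketch below is ours, not the NSSO's; the orbital radius, solar constant, and per-barrel energy content are standard reference values.

import math

GEO_RADIUS_M = 4.2164e7    # geosynchronous orbital radius, meters
SOLAR_CONSTANT = 1361.0    # watts per square meter above the atmosphere
BAND_WIDTH_M = 1000.0      # the report's one-kilometer-wide band
YEAR_S = 3.156e7           # seconds in one year
OIL_BARRELS = 1.28e12      # the report's estimate of recoverable oil
J_PER_BARREL = 6.1e9       # approximate energy content of a barrel of oil

band_area_m2 = 2 * math.pi * GEO_RADIUS_M * BAND_WIDTH_M
solar_joules_per_year = band_area_m2 * SOLAR_CONSTANT * YEAR_S
oil_joules = OIL_BARRELS * J_PER_BARREL

print(f"band collects ~{solar_joules_per_year:.2e} J per year")
print(f"remaining oil ~{oil_joules:.2e} J")
print(f"ratio ~{solar_joules_per_year / oil_joules:.1f}x")

# Both figures land near 1e22 joules -- the same order of magnitude,
# which is all the report's comparison claims.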


Because it didn't have the time or funds to study the feasibility of space-based solar power the traditional way, the NSSO's Advanced Concepts Office (known as "Dreamworks") developed its report through a unique strategy: an open-source, Internet-based forum inviting worldwide experts in the field to collaborate online. More than 170 contributors joined the discussion, with the mission of answering one question:


Can the United States and partners enable the development and deployment of a space-based solar power system within the first half of the 21st Century such that if constructed could provide affordable, clean, safe, reliable, sustainable, and expandable energy for its consumers?


Their answer, delivered in the form of the Oct. 10 report: it's possible, but a lot remains to be done.


The study group ended up making four major recommendations. First, it said, the U.S. government should move to resolve the remaining unknowns regarding space-based solar power and act effectively to allow for the technology's development. Second, the government should also reduce as much as possible the technical risks faced by businesses working on the technology. Third, the government should set up the environment - policy, regulatory and legal - needed to develop space-based solar power. And, fourth, the U.S. should commit to becoming an early demonstrator, adopter and customer of space-based solar power and set up incentives for the technology's development.


"Considering the development timescales that are involved, and the exponential growth of population and resource pressures within that same strategic period, it is imperative that this work for 'drilling up' vs. drilling down for energy security begins immediately," the NSSO report stated.


If it could be done, space-based solar power would have incredible potential, the NSSO said: It could solve our energy problems, deliver "energy on demand" for troops in the field, provide a fast and sustainable source of energy during humanitarian disasters, and reduce the risk of future conflict over dwindling or risky energy supplies.


Considering that, over the past 30 years, both NASA and the Department of Energy have invested a meager $80 million in space-based solar power research (compared to $21 billion over the last half-century for nuclear fusion - which still remains out of reach as a feasible power source), maybe it's time to direct our research energies - and dollars - upward.





Thursday, October 11, 2007

Passenger screening machines for airlines: "millimeter-wave passenger imaging technology"


It's the newest weapon in the TSA's air safety arsenal. It's called "millimeter-wave technology," and it's on the job beginning Thursday at Sky Harbor International Airport in Phoenix.


The machine creates a 3-D image of the passenger's body, then sends it to a viewing station in another room, where a TSA agent looks for potential threats.


"It's passenger imaging technology, so it allows us to see the entire image of the passenger's body and anything that might be hidden on the person" said Ellen Howe of TSA.


The new technology also includes new privacy protections. The screener in the viewing room can't see the passenger's face, and the images from the machine are deleted once the traveler is cleared to fly.


You'll see the new machine after passing through the first layer of airport security. It's an option for travelers selected for extra screening who don't want to be patted down by an officer.


"This way, they won't have to have anyone touch them, and they can get through the process very quickly" said Howe.


"You don't have to worry about being patted down, they don't have to have somebody there to pat you down. It'll save time, I think, if anything" said traveler Mark Bongiovi.


"Any time they can improve the process, make it more efficient for travelers, it's a good thing" said traveler Wendy Gilpin.


TSA officials say from start to finish the scan takes about 60 seconds. The field tests start Thursday in Phoenix and in the weeks ahead the TSA will be testing in other major cities.


The new type of walk-through security machine will debut at several U.S. airports in the coming days as the Transportation Security Administration tries out the latest in body-scanning technology.


It's called "millimeter-wave passenger imaging technology," and it produces a more detailed picture than the metal detectors in use now at airports to screen for weapons and explosives..


Because it produces such a detailed image, however, technology and privacy experts at the American Civil Liberties Union are not satisfied that the new technology meets privacy standards.


In a written statement issued Thursday, Barry Steinhardt of the ACLU said the technology can pick up graphic body images and even medical details like whether a passenger has a colostomy bag.


Steinhardt called the screening an "assault on the essential dignity of passengers that citizens in a free nation should not have to tolerate."


TSA spokeswoman Ellen Howe said privacy will be respected with the new millimeter-wave technique.


"We want to preserve passengers' privacy and make them feel comfortable with trying a technology like this," she said.


A TSA officer will escort a passenger to the machine for the test, but the person looking at the actual body scans will be at a different location and will not see the passenger, the TSA said.


In addition, the scans will have a "modesty filter" to blur out faces, and no images will be saved.


But the ACLU expressed concern that TSA officers would not be able to resist the temptation to save images of certain people, such as celebrities, and that the plan to blur faces might later be changed.


This is how the new scanners work. The passenger steps into a machine where he or she is quickly scanned with radio waves.


Those waves reflect off the body to transmit a three-dimensional image of the passenger that looks like a fuzzy photo negative. A TSA officer studies the image on a screen and looks for unusual shapes that might mean a passenger is carrying something suspicious.


Passengers who are asked to undergo a second screening can choose a pat-down search or the millimeter-wave test.


The TSA says the machines scan a passenger twice, each scan taking less than two seconds. But it takes another minute or two for a screener to review the scans before signaling a passenger to move on.


The TSA demonstrated the screening technology at a news conference Wednesday near Washington.
Howe said the millimeter wave is harmless and "can see more than a magnetometer," which is the first screening machine airport passengers encounter.


"A magnetometer only picks up metal or weapons, so this could see other materials that might be hidden on the body and it also produces an image" rather than just a beep, she said.


Asked if the millimeter wave could detect an object hidden in a body cavity, she said only that the TSA will learn more about the technology as it's tested at U.S. airports.


The TSA has been testing another type of imaging technology called backscatter. This technology also came under some fire because it shows very detailed body images -- which led some critics to call it an electronic strip search. So, the backscatter was altered and blurred to show more of an outline of the body.


The TSA will continue to test the backscatter scanners in some airports. TSA officials said they are a long way from deciding whether they'll settle on just one of these imaging technologies.


Phoenix Sky Harbor International Airport in Arizona begins using the new machines Thursday -- offered as an option for people who are asked to be screened a second time.


Los Angeles International Airport in California and John F. Kennedy International Airport in New York are also slated to try the machines.


Britain, Spain, Japan, Australia, Mexico, Thailand and the Netherlands are already using millimeter-wave screening. In the United States, some courthouses and jails are trying it as well.





Tuesday, October 9, 2007

Albert Fert & Peter Grünberg, Physics of Hard Drives Wins Nobel


Two physicists who discovered how to manipulate the magnetic and electrical properties of thin layers of atoms to store vast amounts of data on tiny disks, making iPods and other wonders of modern life possible, were named winners of the Nobel Prize in Physics yesterday.


Albert Fert, of the Université Paris-Sud in Orsay, France, and Peter Grünberg, of the Institute of Solid State Research at the Jülich Research Center in Germany, will share the $1.5 million prize awarded by the Royal Swedish Academy of Sciences.


They will receive the money in a ceremony in Stockholm on Dec. 10.


Dr. Fert, 69, and Dr. Grünberg, 68, each working independently in 1988, discovered an effect known as giant magnetoresistance, in which tiny changes in a magnetic field can produce huge changes in electrical resistance.


The effect is at the heart of modern gadgets that record data, music or snippets of video as a dense magnetic patchwork of zeros and ones, which is then scanned by a small head and converted to electrical signals.


"The MP3 and iPod industry would not have existed without this discovery," Börje Johansson, a member of the Royal Swedish Academy, said, according to The Associated Press. "You would not have an iPod without this effect."


In remarks broadcast over a speakerphone at the academy in Stockholm, Dr. Fert said: "I am so happy for my family, for my co-workers. And I am also very happy to share this with a friend."


Experts said the discovery was one of the first triumphs of the new field of nanotechnology, the science of building and manipulating assemblies of atoms only a nanometer (a billionth of a meter) in size.


The scanning heads in today's gizmos consist of alternating layers only a few atoms thick of a magnetic metal, like iron, and a nonmagnetic metal, like chromium. At that small size, the strange rules of quantum mechanics come into play and novel properties emerge.


The Nobel citation said Dr. Fert and Dr. Grünberg's work also heralded the advent of a new, even smaller and denser type of memory storage called spintronics, in which information is stored and processed by manipulating the spins of electrons.


Engineers have been recording information magnetically and reading it out electrically since the dawn of the computer age, but as they have endeavored to pack more and more data onto their machines, they have been forced to use smaller and fainter magnetic inscriptions and thus more and more sensitive readout devices.


It has long been known that magnetic fields can affect the electrical resistance of magnetic materials like iron. Current flows more easily along field lines than across them. The effect was useful for sensing magnetic fields, and in heads that read magnetic disks. But it amounted to only a small change in resistance, and physicists did not think there were many prospects for improvement.


So it was a surprise in 1988 when groups led by Dr. Fert at the Laboratoire de Physique des Solides and by Dr. Grünberg found that super-slim sandwiches of iron and chromium showed enhanced sensitivity to magnetic fields - "giant magnetoresistance," as Dr. Fert called it. The name stuck.


The reason for the effect has to do with what physicists call the spin of electrons. When the magnetic layers of the sandwich have their fields pointing in the same direction, electrons whose spin points along that direction can migrate freely through the sandwich, but electrons that point in another direction get scattered.


If, however, one of the magnetic layers is perturbed, by, say, reading a small signal, it can flip its direction so that its field runs opposite to the other one. In that case, no matter which way an electron points, it will be scattered and hindered from moving through the layers, greatly increasing the electrical resistance of the sandwich.
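The size of that resistance swing can be sketched with the standard two-current model of GMR. The snippet below is our illustration with made-up channel resistances; it is not a calculation from the laureates' papers.

# Two-current (Mott) model: each spin channel sees a low resistance (r_low)
# in a layer whose magnetization matches its spin, and a high resistance
# (r_high) otherwise. The two channels conduct in parallel.

r_low, r_high = 1.0, 5.0  # illustrative values, arbitrary units

# Parallel magnetizations: one channel is low-low, the other high-high.
r_parallel = (2 * r_low) * (2 * r_high) / (2 * r_low + 2 * r_high)

# Antiparallel magnetizations: each channel sees one low and one high layer.
r_antiparallel = (r_low + r_high) / 2

gmr = (r_antiparallel - r_parallel) / r_parallel
print(f"R(parallel) = {r_parallel:.2f}, R(antiparallel) = {r_antiparallel:.2f}")
print(f"GMR ratio = {gmr:.0%}")  # equals (r_high - r_low)**2 / (4*r_low*r_high)

With these toy numbers the resistance jumps 80 percent when the layers flip from aligned to opposed - a "giant" effect next to the small change available from ordinary magnetoresistance.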


As Phillip Schewe, of the American Institute of Physics, explained, "You've leveraged a weak bit of magnetism into a robust bit of electricity."


Subsequently, Stuart Parkin, now of I.B.M., came up with an easier way to produce the sandwiches on an industrial scale. The first commercial devices using the giant magnetoresistance effect were produced in 1997.


Dr. Grünberg was born in Pilsen in what is now the Czech Republic and obtained his Ph.D. from the Darmstadt University of Technology in Germany in 1969. He has been asked many times over the years when he was going to win the big prize, and so was not surprised to win the Nobel, according to The A.P.


He said he was looking forward to being able to pursue his research without applying for grants for "every tiny bit."


Dr. Fert was born in Carcassonne, France, and received his Ph.D. at the Université Paris-Sud in 1970. He told The A.P. that it was impossible to predict where modern physics is going to go.


"These days when I go to my grocer and see him type on a computer, I say, 'Wow, he's using something I put together in my mind,'" Dr. Fert said.



iPods, Better Laptops Stemmed from Nobel Prize Discovery



The 2007 Nobel Prize in Physics has been awarded to two researchers for their discovery of Giant Magnetoresistance (GMR), a sort of nanotechnology that enables more compact, higher-capacity hard disks to be squeezed into laptops, iPods and other such devices.


The discovery was made separately in 1988 by Albert Fert of France and Peter Grünberg of Germany, though the technology didn't really take hold until the late 1990s.


GMR technology allows data to be read from very compact hard disks. Here's a description from the Royal Swedish Academy of Sciences, which doles out the Nobel Prizes:


"A hard disk stores information, such as music, in the form of microscopically small areas magnetized in different directions. The information is retrieved by a read-out head that scans the disk and registers the magnetic changes. The smaller and more compact the hard disk, the smaller and weaker the individual magnetic areas.


"More sensitive read-out heads are therefore required if information has to be packed more densely on a hard disk. A read-out head based on the GMR effect can convert very small magnetic changes into differences in electrical resistance and therefore into changes in the current emitted by the read-out head. The current is the signal from the read-out head and its different strengths represent ones and zeros."




Last year, the prize went to John Mather and George Smoot "for their discovery of the blackbody form and anisotropy of the cosmic microwave background radiation."


The real Nobel Prizes are being announced a week after the quirky Ig Nobel Prizes for weird science were announced at Harvard University.


Research into the mystery of wrinkles on bed sheets, the bottomless bowl of soup and the effect of Viagra on hamster jet lag dominated those awards.







Google Buys Into Microblogging (Jaiku)


Do you know what Jaiku is? That was actually a test. If you do, you have probably already heard about the acquisition on Twitter or on Jaiku itself. In other words, you are on the vanguard of trying to evolve a new sensory organ devoted to instantly perceiving what your friends are doing at any moment (and, at the same time, how to profit from the latest technology trends).
Otherwise, you probably assumed Jaiku is some game played with dice that Google will put in its employee lounges. I'll bet this second group represents something that rounds easily to 100 percent of the adult population.
For all of those people: Jaiku, like Twitter, is what has become known as a microblogging service that lets people send short blasts of information about themselves to their friends and to the public. The company is based in Helsinki, and was founded by Jyri Engeström and Petteri Koponen. Not surprisingly both have been heavily involved in the mobile phone world. (Here are Google's blog post and Jaiku's FAQ on the deal.)
Despite the obsession of a small corner of Silicon Valley with Twitter, I suspect this is hardly a blip in the evolution of the Internet. The terms of the deal were not announced, but doubtless the company was sold for an amount in the millions or low tens of millions of dollars.
Google is not picking up a significant number of users in buying Jaiku. And I don't see any evidence that Jaiku has technology that is very hard to build. So we've got to assume Google is paying a lot of money to hire a small group of engineers it likes, as it tends to do.
This may also be a sign that Google has overstaffed its business development department and is doing deals just to keep them busy.
Still, Jaiku and Twitter, which recently raised money from Union Square Ventures, are onto something. AOL Instant Messenger showed that there is something very engaging about watching what the people we know are doing - logging on and off, putting simple information in their "away" messages. Facebook found a way to amplify this with an easy-to-update "status" message, brilliantly aggregated into a personal newsfeed for each user. Twitter and Jaiku, of course, are the newsfeed without the rest of the service.
So the question here, of course, is whether status updates really will become a mass product on a standalone service, or whether they will be a feature of some other more complex offering.
You've got to bet that status, presence and so on constitute a feature. It's too easy to add these to other services that are more engaging. And I suspect that there are enough other sites wanting to expand their use for social communication that there will be many offers for Twitter whenever it decides it's time to sell.
Google, after all, has decided that it is simply too complex to create a new interface for each good idea and has been on a campaign to focus on developing "features not products." The best example of this is the integration of its instant message system into Gmail. Indeed, you can already see little orange icons showing which of your Gmail contacts are online at any given moment. And it is easy to imagine that this interface could easily add a stream of text or photo blasts too.
I'm sure some users would like that. What's not clear is why Google needed to buy a standalone company to offer it.
By the way, I asked Google for comment and haven't heard back yet. I'll update this post if they reply and add anything.


UPDATE: I just ran across this bit of fan mail to Jaiku from Tim O'Reilly. He is particularly enamored of how the service can integrate into the address book of a few high-end cellphones. As you start to dial a person, you can see their latest status update and where they are. As Google moves into the phone software business, it's possible that this sort of feature might be interesting. Google certainly has a fondness for services that relate to geographical location.


Google buys Finnish startup Jaiku.


Google announced on Tuesday it is buying Jaiku, a Finnish startup specializing in letting friends use mobile telephones to share what they are doing at any given moment.


Google is making a priority of following Internet users as they go mobile and is even reported to be crafting a "gphone" with an open-source software platform tailored to its online services.


Jaiku is a social networking and mini-messaging service that enables people to keep track of each others' activities while on the move using curt missives sent to mobile telephones.


The Helsinki-based firm founded early last year by Jyri Engestrom and Petteri Koponen has been compared to the popular US-based service Twitter.


"Technology has made staying in touch with your friends and family both easier and harder," Google product manager Tony Hsieh wrote in a posting on the California firm's website.


"Living a fast-paced, on-the-go lifestyle is easier (and a lot of fun), but it's more difficult to keep track of everyone when they're running around at warp speed. That's why we're excited to announce that we've acquired Jaiku."


Financial terms of the deal were not disclosed.


Last month, Google's quest for devotees in the booming world of mobile online services led to its purchase of Zingku, a startup company that streamlines sharing pictures, messages and more via smart phones.


About Jaiku
Jaiku is now a part of Google. For more details about Jaiku and Google, see the Q&A about the acquisition.


Jaiku's main goal is to bring people closer together by enabling them to share their activity streams. An activity stream is a log of everyday things as they happen: your status messages, recommendations, events you're attending, photos you've taken - anything you post directly to Jaiku or add using Web feeds. We offer a way to connect with the people you care about by sharing your activities with them on the Web, IM, and SMS - as well as through a slew of cool third-party applications built by other developers using our API.


The most powerful instrument of social peripheral vision is your mobile phone. We've put in a special effort to create Jaiku Mobile, a live phonebook that displays the activity streams, availability, and location of your Jaiku contacts right in your phone contact list. We modestly believe it is the best solution out there for seeing what your friends are up to. Currently Jaiku Mobile is available for phones based on the Nokia S60 software platform (see the list of compatible devices).


Check out our Jaikido blog for updates about the service. We appreciate your feedback, so feel free to comment away on the blog - or join our feedback and ideas channel.


For an insider's view into things happening at Jaiku, follow the updates from the Jaiku Team.





Monday, October 8, 2007

Fiber-Optic Future For Norwich


Imagine a fire call to the city central dispatch for a remote location within the city limits.


As the dispatcher calls the volunteer department, a computer prints a picture and exact directions and sends them electronically to the responding station - or even to a portable computer in the firetruck as it speeds out of the station bay.


That would be one way a municipal fiber-optic network could help city agencies and the general public, said John Bilda, general manager of Norwich Public Utilities.


NPU went out to bid Wednesday on installing a fiber-optic telecommunications system in the city that would connect all schools and municipal and public-utilities facilities, including automated sewer pump stations, hydropower units and electrical transformers.


The 32-mile, $2.4 million network would snake through the city in two main loops, with several spurs from the main loop lines to connect more remote systems.


The network, Bilda said, would send data 600 times faster than current speeds along privately owned data lines, and do it more reliably.


Mayor Benjamin Lathrop called officials of the city-owned utility "visionaries," dating back 104 years to when the city took over a private electric and gas company by eminent domain and converted it to a public utility.


"They were visionaries then, with the (public water) reservoirs and all," Lathrop said, "and by exploring what they did all those years ago to move their city forward. It's impressive. Our utility has done wonders."


Immediate plans would have the fiber-optic network serving only Norwich government entities - adding in agencies such as the Uncas Health District, Three Rivers Community College and Norwich Free Academy - and would provide internal communication only within Norwich borders.


A teacher in a Norwich school could draw a line on a computer and have it automatically appear simultaneously on so-called smart boards in every school in the city. But all the sites would still use AT&T for telephone service and 99 Main - the city's Internet provider - for access to "the outside world," Bilda said.


NPU plans to create wireless hot spots in key locations, such as downtown, that would be available to the public, Bilda said.


Expanding the network to local businesses and residents could follow. NPU plans to apply to the state Department of Public Utility Control for permission to offer service, according to the resolution approved by the City Council Monday.


Bilda couldn't say when that might occur, but he said the cable to make it possible could be in place by next summer. The rest could depend on the DPUC licensing process and the city's desire to open it up to the community.


NPU is a pioneer in municipal fiber-optic installation in the state, but not the first, Bilda said. About five years ago, the town of Manchester helped write the law that now allows NPU to move forward.


DPUC spokeswoman Beryl Lyons said no other city-owned utility has applied for a state license to offer fiber-optic broadband to the general public. Only the few municipally owned utilities that own their own utility poles would be able to tackle the project, she said.


Groton Public Utilities launched its own for-profit cable television company, Thames Valley Communications, three years ago. The cable television and computer broadband company now has 7,000 customers in Groton city and town, and the Groton portion of Mystic and Gales Ferry, said Carl Andersen, marketing director.


Andersen said it has taken longer than expected to get permission to build on poles outside the Groton Public Utilities service area.


NPU owns all its utility poles and many underground utility conduits in the Norwich Business Park.


NPU has no plans to start a cable television company or become an Internet provider or telephone company. Rather, Bilda said, the network would allow NPU and other city entities to greatly consolidate telephone service, buying one telephone-trunk service line from AT&T and using its own fiber-optic network to hook up to numerous telephones and computers.


One of the utilities' aims is to save on its telephone bills. "We're doing this to stay in business," Bilda said.


The city plans to continue to use 99 Main as its Internet provider, but city computers would be able to communicate with one another much faster and at higher capacities.


If the fiber-optic service is expanded to local businesses and the public, Bilda said, NPU would not make it a for-profit venture. Ten percent of the gross revenues would be turned over to the city, a deal that dates back to the founding of the public utility.


If Norwich wants to venture into cable television or telephone service, the City Council would have to authorize the move. Bilda said the initiative would have to come from NPU constituents.


"We want to do whatever the community wants us to do," Bilda said. "This provides the backbone for any of these services to happen."


At least two downtown business owners are counting the days until fiber-optic broadband data transmission service becomes available.


Mike Sullivan, owner of 99 Main, said his company has been the city's Internet provider for 11 years. He said the connection would enable him to offer high-speed fiber-optic Internet connections to local small businesses that can't afford the high-speed T-1 lines that are now the standard for high-speed connections.


Fiber optics would far surpass T-1 capacities, Sullivan said. The smallest fiber circuit can transmit data at 155 megabits per second, while a T-1 line sends at about 1.5 megabits per second. Slower DSL lines are still the standard for home and small-business use, he said.
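To put those line rates in perspective, here is a rough transfer-time comparison. The arithmetic is ours; the 1-gigabyte file size is just an example, and the rates are the nominal T-1 and OC-3 figures.

FILE_SIZE_BITS = 8e9  # a 1-gigabyte file, expressed in bits

# Nominal line rates in megabits per second
rates_mbps = {"T-1": 1.544, "OC-3 fiber": 155.52}

for name, mbps in rates_mbps.items():
    seconds = FILE_SIZE_BITS / (mbps * 1e6)
    print(f"{name}: ~{seconds / 60:.1f} minutes")

# T-1: roughly an hour and a half; OC-3: under a minute. For a video
# producer moving clips to a distant studio, that is the whole argument.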


Brian Kobylarz, owner of Tele-Cine Productions, served on the initial focus group NPU established several years ago when utility officials first started looking into expanding to cable television and fiber-optic broadband services.


Kobylarz said any business with electronic data needs would benefit. He produces high-definition videos and films for industrial, business and government entities.


Fiber optics would give him quicker, better quality transmissions of video clips to production studios "miles or hundreds of miles away."


Kobylarz, who also chairs the Downtown Neighborhood Revitalization Zone Committee, envisions the fiber-optic network attracting high-tech businesses to the downtown.


"Major corporations have realized the benefits of this technology for many years now," Kobylarz said. "The business model says this is the right thing to do. What we are doing in Norwich is the first step. It will be a better step when it begins to open up to the business community. That will spur economic activity and will attract new and better businesses to the area."







Tuesday, October 2, 2007

Having the space program as a very challenging real-world mission to focus tech development around was tremendously inspiring and productive


"Having the space program as a very challenging real-world mission to focus tech development around was tremendously inspiring and productive." --Scott Fisher, founding director, Virtual Environment Workstation Project at NASA Ames


A half-century of space flight
In celebration of the 50th anniversary of the Sputnik launch on October 4, we take a look back at some of the ships that have helped humans explore space and some of those that might do so in the near future. Forget about the Xbox and the iPhone. This is some serious hardware.


The launch of the basketball-size satellite is widely considered the dawn of the space age, and began the space race between the United States and the Soviet Union.


In 1955, both the United States and the Soviet Union announced plans to launch satellites into orbit as part of the International Geophysical Year, which had been established to take place from the middle of 1957 through the end of 1958.


The U.S. may have announced its plans first, but the U.S.S.R. got off the ground startlingly fast. Sputnik I, pictured at left, launched October 4, 1957, raising fears among Americans that it gave the Soviet Union a leg up on the U.S. not only technologically, but in the ability to launch nuclear missiles.


Sputnik II, carrying a dog named Laika and a much heavier payload, soon followed, launching only a month later on November 3.


For anyone who's ever been stuck in rush-hour traffic on U.S. Highway 101 through Silicon Valley, the region's overgrowth of green-glass office buildings, ugly tech company headquarters and expensive cars is a frustrating flip side to the steady stream of world-changing innovation that has emerged there.


But if you'd visited the region in 1930, all you'd have seen was a two-lane highway cutting through acres and acres of nothing but farmland and tiny hamlets, and not even a hint of what would someday become arguably the most important commercial technology center in the world.
In December of that year, however, word came that the U.S. Navy was going to open an air station in Sunnyvale, Calif., one that would handle gigantic airships and that would need a mammoth hangar.


The result? The Sunnyvale Naval Air Station, later known as NASA Moffett Field. And today, Moffett is home to NASA's Ames Research Center, a facility that is at the heart of Silicon Valley, both geographically and figuratively. In 1930 the region didn't know what was about to arrive, but it soon realized how much change was coming.


"Industries allied to aviation will spring up like mushrooms, each bringing its own payroll," wrote the San Jose Mercury Herald in 1931, according to NASA. "It means in short that San Jose and the Bay region are on the threshold of the most glorious era of posterity in their history."


Usually, such proclamations fall short of reality, but on this the newspaper was spot on. While the projected growth was expected to be tied to aviation, not space research, the arrival in 1939 of the National Advisory Committee for Aeronautics--the precursor to NASA--and later NASA itself helped drag the Valley into the center of American industry.


Of course, Silicon Valley has grown way beyond NASA since the Apollo program was leaning on researchers from Stanford, nearby University of California at Berkeley and a number of small companies that started to dot the area in the 1960s. But in the crucial early years of the Valley's technology industry, government contracts played a key role.


"Several companies in what would become Silicon Valley benefited from the ambitious goals and budget largesse of the Apollo space program," said Dag Spicer, the senior curator of the Computer History Museum, in Mountain View, Calif. "The stringent quality and performance requirements of (integrated circuits) for Apollo allowed early semiconductor companies to learn at government (that is, public) expense, a technology that would soon have broad application and whose price would plummet as these companies perfected manufacturing methods."


A list of companies that emerged to take advantage of NASA's work on integrated circuits would be impossible to compile today, but there's no doubt that among the biggest winners on such a list would be Fairchild Semiconductor, and Intel, which was founded by Fairchild's Robert Noyce and Gordon Moore.


"Fairchild...was likely the largest recipient of government-related integrated circuit work," said Spicer. "The irony of these early contracts was that, while they were welcome in the early 1960s (when) semiconductor companies were learning how to make integrated circuits, by 1970, government/military work was frequently viewed as a damper on profits and innovation since it took people and resources away from research and development into newer and more profitable commercial products."


Nonetheless, the Apollo program turned out to be a fantastic source of technology that would eventually find its way into commercial products and applications. Also among the companies that would most benefit from the program was Hewlett-Packard. HP's association with the space program, in fact, pre-dates NASA, according to Measure magazine.


"HP's instrument sales force has been selling to the space program since the 1950s, before NASA was formally created," wrote Measure magazine in 1983, according to information provided by Devon Dawson, an archivist for HP spinoff Agilent Technologies. "NASA and its contractors use instruments from virtually every HP division to develop, test and support the sophisticated electronic equipment used in all NASA programs."


Specific instances of the HP-NASA alliance on the Apollo 11 program abound, Measure wrote in its September 1969 issue: The launch control facility at Cape Kennedy and Houston's Mission Control Center both utilized HP technology such as FM-AM telemetering signal generators and RF vector impedance meters. And, HP's Precision Frequency Source keyed to a cesium clock built by the company "provided the precise frequency outputs used for thousandths-of-a-second accuracy throughout the worldwide Apollo network of tracking stations and communications systems."


The relationship between HP and NASA has stayed strong, Dawson said. Among the space programs employing HP or Agilent technology are space shuttle missions, Mariner missions, Voyager 2, the 1995 docking of the Atlantis shuttle with the Mir Space Station and the Lunar Prospector in 1997.


But the impact of the Apollo program on commercial technology goes far beyond such highly specialized equipment and missions. According to Bruce Damer, founder of the DigiBarn computer museum and a frequent NASA contractor himself through his company DigitalSpace, it's possible to draw a direct evolutionary link between the simple flight simulators NASA was using for the Apollo astronauts in 1967 and 1968--what he called "one of the first highly interactive computer environments"--and some of the early commercial video games.


Similarly, NASA's work with wind tunnels at Moffett became so expensive that the agency decided to turn to supercomputers for more cost-effective simulations.


And that, in conjunction with work done at Ames on tele-operations and telepresence--research that tried to simulate the interior of the space shuttle--led to the creation of 3D graphics, head-mounted displays and early virtual reality technology, all partially funded by NASA.


"Starting in the 1960s, as the needs became more necessary...I think that drove the research on graphics tech and certainly computing in general," said Scott Fisher, chair of the interactive media division in the University of Southern California's School of Cinematic Arts, and the founding director of the Virtual Environment Workstation Project (VIEW) at NASA Ames. "When we built a real-time virtual environment system and the flow visualization guys used it to input their data, they were ecstatic that they could manipulate viewpoints into their data by just moving their head or walking around in the data as opposed to typing in a set of coordinates for each new viewpoint."


Another technology to come out of NASA and later find its way into industry was the use of audio technology in pilots' computerized interfaces, said Fisher.


"NASA did lots of work on finding the best ways to alert a pilot to some system problem," Fisher said. "Audio turned out to be very effective." Now, nearly 20 years later, the technology is making its way into video games and other off-the-shelf commercial systems, he said.


The relationship between NASA and space technologies and Silicon Valley and the companies that have blossomed there may best have been summed up by Northrop Grumman chairman and CEO Ronald Sugar in a speech he gave on September 20 at the 50th Anniversary of Space Exploration conference in Pasadena, Calif.



"Space exploration and use has created new industries that today generate billions of dollars of revenue, employ millions of people worldwide and improve the lives of virtually everyone," said Sugar. "Space, which first served as a coliseum for two grappling superpowers, now welcomes new nations to explore and utilize its potential, and in the process, draws all mankind closer together."


Of course, for those who work or worked in the space industry, the experience of being involved with such technologies and seeing how they affected the rest of the world is something that will always be special.


"Having the space program as a very challenging real-world mission to focus tech development around was tremendously inspiring and productive," said Fisher.







Reader Digital Book could turn some heads among gadget lovers




The technology market is really challenging - challenging through quality, challenging through marketing.


While it may not pack the sales bang of Harry Potter and the Deathly Hallows, the latest edition of Sony Electronics Inc.'s Reader Digital Book could turn some heads among gadget lovers when it is released this month.


Sony announced the latest edition of the Reader, model PRS-505, on Tuesday, and said it will be available in the U.S. this month at Sony Style stores and on the Sony Style Web site as well as at Borders Inc. book stores. For US$300, people will get a paperback book-sized Reader in either silver or dark blue, which can hold up to 160 books.


A Sony spokesman was unable to provide a specific date for the launch of the new Reader.


To get people started on their new Reader, Sony is offering credit for 100 classic books, including the works of Shakespeare and Jane Austen, on Connect, an eBooks store set up by Sony. The site includes 20,000 eBooks, including the latest editions of many top authors and much of the New York Times Bestsellers' list. It doesn't include any books in J.K. Rowling's Harry Potter series.


Improvements to the reader include nearly twice the storage space, a battery that will last around 7,500 page views, new controls that are redesigned to mimic page-turning and allow quicker navigation, and a USB (Universal Serial Bus) port allowing the transfer of data from a PC. The new Reader also includes slots for Memory Stick Duo and SD memory cards to increase storage capacity.


An auto-sync feature in the new edition lets users create a folder of books and documents on their computer and automatically synchronise it with the Reader.






Tuesday, September 25, 2007

Zero-Gravity Surgical Robot


A nonprofit R&D organization, will conduct the first demonstration of a teleoperated surgical robot in a zero-gravity environment this week. The robot is controlled with a special interface by a skilled surgeon hundreds of miles away.


The SRI robotic surgical system is designed to be stored in a very compact space for space travel. Astronauts will reassemble the device for use in the event of illness requiring surgical intervention.



The system was successfully tested underwater earlier this year in the Aquarius undersea laboratory off the coast of Florida, where a Canadian surgeon used the device to perform a vascular suturing operation from fifteen hundred miles away.



Now, however, SRI researchers are testing the device in the extreme environment of zero gravity. The tests will be done over a period of four days aboard a NASA C-9 aircraft. The plane undergoes a series of parabolic flight maneuvers that simulate, for brief periods, the microgravity environment of space.


"In previous experiments, SRI successfully demonstrated how robots can be manipulated remotely and set-up with minimal training. We are now extending that technology to movement and weightlessness, critical elements of any space travel program," said Thomas Low, director of SRI's Medical Devices and Robotics program.


SRI-developed software is intended to help the robot compensate for errors in movement that can occur in moments of turbulence or transitions in gravitational field strength. The experiment will compare the same surgical tasks performed by a physician who is physically present on the plane with those performed remotely using the teleoperated robot.



SRI is pioneering other remotely operated surgical systems as well, working with DARPA on the Trauma Pod Battlefield Medical Treatment System. The trauma pod is intended to treat soldiers on the battlefield using advanced diagnostics and teleoperated instruments.


Science fiction writers were arguably the first to imagine such things; the telemedicine apparatus from E.M. Forster's 1909 story The Machine Stops is a very early inspiration to real-life roboticists. More recently, science fiction writer Peter Watts vividly visualized a teleoperated medical mantis that could perform surgery deep beneath the sea's surface.






Monday, September 24, 2007

Improving Vista


The bloggers and users who blasted Microsoft three months ago for failing to deliver Windows Vista add-ons have again called the company on the carpet, this time for missing its self-imposed deadline to provide promised extras.


In late June, bloggers and users were already panning Vista Ultimate Extras as a bust. Extras, available only to customers running the top-end Vista edition, was one of the features cited by Microsoft to distinguish the $399 operating system from its $239 cousin, Home Premium. Microsoft's online marketing, for instance, touted Extras as "cutting-edge programs, innovative services, and unique publications" that would be regularly offered to Ultimate users.


But by June, Microsoft had not released any new Extras since it issued a beta of DreamScene, a video screensaver, in February. That infuriated some users; several days later, Microsoft tried to defuse the situation by promising to wrap up DreamScene and 20 unfinished language packs "by the end of the summer."


Companies planning to roll out Microsoft Corp. 's Windows Vista operating system can thank people like Eric Craig.


Craig, a managing director at Continental Airlines Inc., started moving his company to Vista early this year, far sooner than most businesses. For Continental and other early Vista adopters, being first meant wrestling with an array of challenges that include security programs that weren't ready and problems with "driver" software for running printers and other devices.


Such pioneers are an important part of Microsoft's strategy, which employs customers' feedback to help improve Vista and the software universe designed to work with it.


"We're happy to share our experience," Craig said.


That is good news for the followers, many of whom remain daunted by the prospect of moving to Vista. Indeed, a recent Forrester Research survey of 565 companies in the United States and Europe with more than 1,000 employees showed that only 7 percent plan to start rolling out Vista this year, with 25 percent expecting to begin the process next year. Some 38 percent of respondents said they didn't have plans yet to move to Vista.


Why the reticence? By the time Microsoft first made Vista available to businesses last November - five years after its predecessor, Windows XP - companies had built or bought many layers of software on earlier versions of Windows, including security software to guard against viruses and other malicious computer code. Moving to Vista requires modifying those software layers, running the risk of creating new problems.


And there is simply no substitute for actually installing Vista in real businesses to find the glitches, and to fully test the attendant software - like the drivers - that has been tweaked to run with Vista. Microsoft provides free software tools and programs to help with the migration, but those programs also must be enhanced based on experience gained through their use.


Roughly nine months after Vista was made broadly available, information-technology professionals say installing Vista is gradually becoming easier.


"If you're an IT person sitting down to do this you have much better information available than you did six months ago," said Steve Kleynhans, an analyst at Gartner Group. "Six months from now it's going to be even better."


Continental since May has replaced nearly 2,000 personal computers at two of its three reservation centers with Vista PCs. The work was delayed a bit as Continental waited for certain Vista-compatible drivers, security software and other software from Microsoft partners, Craig said. Continental is now working with Cisco Systems Inc. to rejigger the network-equipment maker's call-center software to work better with Vista. By sometime in the first quarter of next year, Craig expects to have converted up to 7,000 of his company's 18,000 PCs to Vista.


Another organization that experienced the trials is the Australian Customs Service, which decided to shift a fleet of aging PCs to Vista earlier this year. By dumping 3,000 old machines for new ones that come with Vista, the agency avoided some onerous chores associated with upgrading existing hardware. But it still had problems with a security feature in Vista called BitLocker.


The customs service was able to get Hewlett-Packard Co., the PC supplier, to provide a revised piece of software that fixed the problem. Such fixes are typically shared with Microsoft and other makers of software tools to help Vista buyers.


"The third-party support has been a little bit slow but that's progressively gotten better," said Murray Harrison, the customs service's chief information officer.


Microsoft provides a free downloadable software kit businesses can use to upgrade existing PCs to Vista. The kit, called the Business Desktop Deployment tool, automatically transfers information over the Internet about companies' IT systems and technical problems to Microsoft. The data are used to help improve the upgrade kit. So far, customers have downloaded about 220,000 copies of it, Microsoft executives say.


Customer feedback has also aided the process of modifying the driver programs. Microsoft executives say the number of driver programs for Vista has swelled to 2.2 million from 1.5 million in January.


Another critical issue is application compatibility - whether Vista can run the business programs that companies now use. Some are sold by software companies and some are internally developed by users. Companies may have thousands of different applications that need to be tested for compatibility before a new operating system is introduced.


Microsoft runs a program for certifying software applications for Vista. As of last week, about 2,100 applications had been certified - up from 250 in January, Microsoft executives say.


Meanwhile, many IT managers are waiting for Vista improvements that will be included in Service Pack 1, a set of software enhancements for Vista that will have new drivers, security features and bug fixes. Microsoft says the software will be available in January.


Still, some businesses don't see enough benefits from the new software to go through all the trouble.


"We don't have any plans for rolling out Vista in our environment in the near future," said Gentry Ganote, chief information officer of Golf and Tennis Pro Shop Inc., which runs sports shops around the U. S. Instead, Ganote said his company is shifting more of its employees from PCs to more simplified devices known as thin clients, which he said will be easier for his IT group to manage.


Earth Tech Inc., though a big user of Microsoft software, doesn't see enough new value in Vista itself to justify upgrading its 8,000 PCs, said Jim Walsh, its chief information officer.


Walsh, whose company handles infrastructure work that includes building roads and airports and managing water-treatment facilities, is studying whether software that runs on Vista - including the new version of Microsoft's Office suite and software called SharePoint - offers enough benefits to justify a move to Vista.


And like his peers at many other companies, Walsh is concerned about problems arising from adding Vista to his company's IT system, which mixes Microsoft products with software from some 25 other companies.


"I'm generally personally afraid of the integration issues," he said. "Can the existing software I have run on it ? I have to make sure it does."


Such thinking is one reason why a significant number of businesses won't start the transition until next year, said Benjamin Gray, an analyst at Forrester Research. Still, with the gradual improvements to the infrastructure around Vista, for most businesses "it's not a matter of if, it's a matter of when and how," he said.






Sunday, September 23, 2007

Thriving Across Sectors: Oracle Posts 25% Rise



Oracle Corp. continued a recent winning streak, even in what is typically its weakest time of the year.


The software giant reported a 25% jump in earnings and a 26% rise in revenue -- surpassing its predictions -- in its fiscal first quarter, ended Aug. 31. The gains were driven by steady customer purchases of Oracle's cash-cow database and middleware software and by the company's recent acquisitions of several makers of so-called business applications.


Oracle, of Redwood Shores, Calif., is best known for its database software, which helps companies retrieve all the information they have stored digitally.


Melding a hodgepodge of acquired software makers into a cohesive business is supposed to be difficult. But Oracle is making the task look easy, churning out one impressive quarter after another nearly three years into a $25 billion shopping spree that is yielding bigger dividends than many skeptical software analysts and executives anticipated.


The latest gains surfaced late Thursday when Oracle reported that fiscal first-quarter software sales accelerated at the fastest clip in seven years.


Propelled by the robust growth, Oracle earned $840 million, or 16 cents per share, for the three months ended Aug. 31. That represented a 25-percent improvement from net income of $670 million, or 13 cents per share, in the same period last year.


"The numbers are very strong," said Global Equities Research analyst Trip Chowdhry. "Is their strategy paying off from a financial perspective? The answer is yes."


Investors seem to share that opinion. Oracle shares jumped to a new 52-week high of $22.17 on Friday before edging back somewhat to $21.98.


Oracle's market value has climbed by about $40 billion, or more than 50 percent, since its flamboyant chief executive, Larry Ellison, began snapping up the company's smaller rivals in 2004. Oracle has devoured more than 30 companies so far in a bid to lure customers away from SAP AG, the leading seller of business applications software that helps companies manage their operations.


Ellison, whose fortune has swelled to an estimated $26 billion during the run-up in the company's stock, launched the expansion to build upon Oracle's long-established dominance in the database software market.


Although Oracle still trails SAP in revenue from business applications software, the gap separating the two appears to be closing, said AMR Research analyst Bruce Richardson.


"I would say that in the last 12 months, Oracle has certainly established itself as a much more viable software provider," Richardson said.


If not for stock option expenses, Oracle said it would have made 22 cents per share during the first quarter, a penny above the average estimate among analysts polled by Thomson Financial.


Quarterly revenue totaled $4.53 billion, 26 percent above the $3.59 billion in the same period last year, and easily surpassed the average analyst estimate of $4.34 billion. Revenue would have risen 22 percent if not for a weak dollar that bolstered international sales.
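For readers who want to double-check the headline growth rates, they follow directly from the raw figures quoted above. Here is a minimal Python sketch (the inputs come from the story; the rounding is mine):

```python
# A minimal sketch: recompute Oracle's year-over-year growth from the
# raw figures quoted in the story (net income in millions of dollars,
# revenue in billions).

def yoy_growth(current, prior):
    """Percent change versus the year-ago figure."""
    return (current - prior) / prior * 100

print(f"Net income: {yoy_growth(840, 670):.1f}%")    # ~25.4%, reported as 25%
print(f"Revenue:    {yoy_growth(4.53, 3.59):.1f}%")  # ~26.2%, reported as 26%
```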


Perhaps most importantly to investors, Oracle's sales of new software licenses climbed 35 percent to $1.09 billion, soaring past both management and analyst projections. Analysts had been anticipating an improvement in the mid-20-percent range.


Wall Street focuses on software sales because new licenses establish a pipeline for much greater future revenue from product upgrades and maintenance.


The spike in Oracle's first-quarter software sales was the largest since free-spending Internet startups were driving demand in 2000, said Safra Catz, the company's chief financial officer.


Demand for Oracle's applications software was particularly strong, with sales in that category rising 65 percent to $376 million.


The improvement included about $87 million in sales from two recently acquired companies, Hyperion Corp. and Agile Software Corp., whose products weren't sold by Oracle last year.


"It was a strong quarter across the board," said Piper Jaffray analyst Ajaykumar Kasargod. "The (sales) execution is really coming along."


Catz predicted Oracle's momentum will continue in the current quarter. She forecast the company's software sales will rise 15 percent to 25 percent in the three months ending in November to produce earnings of 26 cents or 27 cents per share, excluding stock option expenses, for the second quarter.


The first quarter is usually Oracle's weakest sales period because so many key decision makers take summer vacations.


"If things weren't going well, this is where you would see it," Catz told analysts during a Thursday conference call.






High-Speed Pacific Cable - The Race to Wire Up


Google Inc. is in early discussions to join a group looking to lay a high-speed trans-Pacific undersea cable that could potentially result in the Internet company's becoming an investor in the project, according to a person familiar with the matter.


The discussions highlight the growth of Google's infrastructure requirements as it continues an ambitious international expansion and increasingly offers data-intensive services such as online video, email, and online word processing for businesses.


The talks also come amid a resurgence of interest in laying such fiber-optic cables under the Pacific, as use of the Internet and international phone service has grown quickly in Asia, straining capacity on existing cables, many of which are technologically outdated. An earthquake off Taiwan's coast last December that disrupted Internet service in much of China when it damaged undersea cables heightened calls for new lines to reduce risks of further disruptions. Some of the world's biggest telecom carriers -- including Verizon Communications Inc. and AT&T Inc. -- are already moving to build new trans-Pacific cables to keep up with the surging volumes of Internet and phone transmissions.


"Additional infrastructure for the Internet is good for users and there are a number of proposals to add a Pacific submarine cable," said a Google spokesman in a statement, declining to comment further. The talks were reported earlier in Australia's Communications Day. The person familiar with the matter said the discussions remained fluid.


The talks come amid longstanding speculation about Google's intentions in the telecommunications industry. Such speculation has centered on whether it plans to offer Internet or voice access broadly to consumers, such as through a U.S. wireless-spectrum license Google has said it will likely bid for next year. One person familiar with the carriers' thinking said that the potential undersea fiber-optic investment could reflect Google's recent push to provide Internet-based services such as email and word processing to businesses, since companies have a lower tolerance for service interruptions and maintain offices around the world.


Colby Synesael, an analyst at Merriman Curhan Ford & Co., said that the potential undersea cable could allow Google to have greater control over its operating costs and infrastructure needs. He said he isn't concerned that one more cable under the Pacific would create a glut of capacity. "Even if it does create a bubble today, I would argue that capacity demand would catch up in two to three years," said Mr. Synesael.


The Race to Wire Up


Google may be the ultimate do-it-yourself company. From the start, Google's sense of its own engineering superiority, combined with a tightwad sensibility, led it to build its own servers. It writes its own operating systems.
It is now threatening to buy wireless carrier spectrum and it is getting ready to hire ships that will lay a data communications cable across the Pacific, according to a report from Communications Day, an Australian trade news service.
According to the publication, Google would be part of a project called Unity that would also include several telecommunications companies. Unity hopes to have a cable in service by 2009. Google would own a dedicated portion of the multi-terabit cable, giving it a significant cost advantage for trans-Pacific data transmission over rival Internet companies.
Barry Schnitt, a Google spokesman, didn't confirm the plan, but did tell the publication the company is interested in the area, saying, "Additional infrastructure for the Internet is good for users and there are a number of proposals to add a Pacific submarine cable. We're not commenting on any of these plans." Communications Day also noted that Google has advertised to hire people who would "be involved in new projects or investments in cable systems that Google may contemplate to extend or grow its backbone."
Google has long been buying up data communications capacity. Its search engine works by making copies of nearly every page of the Internet in its own data centers. That requires Google to move no small amount of data around the world on a regular basis. And its new plans to deliver applications over the Internet will use even more bandwidth.
Dave Burstein, the editor of DSLPrime, who tipped me off to the CommDay report, explained that even though there is a lot of unused fiber capacity across the Pacific, there are few players, and prices are seen as unusually high. He adds that there is a glut of cable-laying ships, so the cost of building a new link to Asia has come down.
This new move puts Google in competition again with Verizon, which has fought Google's approach to the new wireless spectrum auction in the United States. Verizon is part of a group of Asian carriers that is building a $500 million cable between the United States and China.





Thursday, August 30, 2007

Nokia web service offers music, maps, games

PB: Md Moshiur Rahman, sponsored by www.careerbd.net
BEIJING, Aug. 30 (24hoursnews) -- Nokia on Wednesday launched a series of Web services under the brand name Ovi that allows users of its phones to download games, maps and music directly to their cell phones.
The Finnish mobile-phone maker's new Music Store is in direct competition with Apple's iTunes offering. Although Apple's iTunes is less expensive, Nokia's service is a huge step forward in accessibility.
iPhone users have to download songs to their computers, but Nokia's Ovi users can download songs directly to their phones. Ovi will be available in Europe in the fourth quarter of 2007, but at present there is no timetable for entering the United States.
The company has struck deals with the world's four biggest music labels, Universal Music, Warner Music Group, EMI and Sony BMG, and some of the largest game makers, including Electronic Arts and Gameloft for its game store, N-Gage.
"The services unit will, in terms of sales, be extremely small in the beginning," said Nokia spokesman Kari Tuutti. "But that's really the future we see for Nokia, to be able to develop our business around offering services to people."
Tuutti said there are already more than 900 million people in the world with a Nokia mobile phone in their pocket, and he expects many of them to replace those handsets with Ovi-enabled phones in the next two years. Customers can't download Ovi to their current Nseries phones; they have to upgrade to the new models being launched this autumn.
Nokia is launching the 350-euro (476 U.S. dollars) N81 worldwide, along with a larger-capacity version of its N95, with 8 gigabytes, for 550 euros (748 dollars) in October. Later in the year there will be some lower-priced models with the Ovi software, including the Nokia 5310 XpressMusic, for 325 euros (442 dollars), and the Nokia 5610 XpressMusic, for 300 euros (408 dollars). Nokia is hoping that rival phone makers will want to use its Ovi software, but it's doubtful they will at first.
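As an aside, the dollar conversions quoted in this story are all internally consistent. A minimal Python sketch reproduces them (the 1.36 EUR/USD rate is inferred from the story's own number pairs, not an official quote):

```python
# A minimal sketch: the story's dollar figures all imply the same
# EUR/USD exchange rate (~1.36, inferred from the quoted pairs).

EUR_USD = 1.36  # implied rate, not an official quote

for eur, usd_quoted in [(350, 476), (550, 748), (325, 442), (300, 408)]:
    print(f"{eur} EUR -> {eur * EUR_USD:.0f} USD (story says {usd_quoted})")
```
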
The first Nseries handset with the capability to access Ovi, Finnish for "door," will be the N81, a slider phone that comes with 8GB of memory. The device has been slightly optimized for gaming, with thumb buttons on either side of the phone's face, a bit like a PSP or Nintendo DS, so users can play games with two hands instead of one thumb. It also comes with a navi-scroller that is touch sensitive, similar to an iPod scroll wheel.
New N-Gage En Route
Nokia announces launch, games, and fresh details for the brand new N-Gage service.

August 29, 2007 - The N-Gage is reborn. No longer a specific physical device but an online service, the new N-Gage is set to launch worldwide in November. Users with compatible N-series devices can download the N-Gage application directly to their handsets, while all future N-series handsets coming off the assembly line will have the service already onboard.

The new N-Gage, revealed at E3 in 2006, is something of a smart hub, a destination where gamers and community members log in to check out games, download applications, and participate in events like challenges and contests. The service can be likened somewhat to Xbox Live, giving users a central place to download games and demos, engage other players, and check out community features.

The new N-Gage is actually part of a new mobile strategy from Nokia called Ovi (Finnish for "door"). The Ovi mantle contains not only N-Gage but also a new music store and a map service; after N-Gage, Nokia Music Store and Nokia Maps are the second batch of applications announced for the Ovi service. Nokia will add additional functionality to Ovi over time.
Nokia has announced several high-profile partnerships for the new N-Gage in recent months, with companies such as Capcom, Digital Chocolate, and I-play. At a London event today, Nokia showed off a spate of new games, including Snakes Subsonic. Games are currently priced in the $8-$15 range. The N-Gage service is compatible with the N73 and N81, as well as the N95 and N93 series; additional devices will be announced in the future.

Nokia Handsets - Tech Gift

PB: Md Moshiur Rahman. Sponsored by www.careerbd.net
Alongside its Internet foray, Nokia has reportedly unveiled four new mobile devices optimized for entertainment, music, and games. The N81 multimedia computer has dedicated music and gaming keys that light up when activated, a 3.5-mm headphone connector, and 3G and WLAN connectivity. The N81 is also configured to find, buy, manage, and play music and games purchased from the newly launched Nokia Music Store and the N-Gage games service, respectively. At 360 euros and 430 euros, the N81 and the N81 8GB are both expected to begin shipping in Q4 this year. The N95 8GB has all the key features of its predecessor, including a 5-megapixel camera with Carl Zeiss optics, built-in A-GPS, WLAN, HSDPA, and a two-way slide, and offers up to 8GB of built-in memory. It will start shipping in the fourth quarter for 560 euros (before subsidies/taxes).
About 9.9mm thick and weighing less than 71 grams, the 5310 XpressMusic phone offers up to 18 hours of music playback and holds up to 3,000 songs on an optional 4GB microSD card. The phone features dedicated music keys, a 2-megapixel camera, and a bright 2-inch QVGA screen with up to 16 million colors. The 5310 XpressMusic, too, will ship in Q4 for 225 euros (before taxes and subsidies). The other new XpressMusic handset, the 5610 XpressMusic, features a Music Slider key for accessing music with a flick of the thumb. The phone has a black high-gloss finish and aluminium side panels, and features a 2.2-inch, 16-million-color display and a 3.2-megapixel camera with auto-focus and dual LED flash. It offers up to 22 hours of music playback and memory for up to 3,000 songs on an optional 4GB microSD card. Like the other new phones, the 5610 XpressMusic will ship in Q4 2007 for 300 euros (before taxes and subsidies). Both new XpressMusic handsets are compatible with the Nokia Music Store.
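That "3,000 songs on a 4GB card" figure is worth a quick sanity check. The minimal Python sketch below (the 4-minute track length and the bitrates are my illustrative assumptions, not Nokia's published ones) suggests the claim only holds at a low bitrate of around 48kbps, the sort of rate eAAC+ targets; at 192Kbps, the rate the Nokia Music Store's WMA downloads reportedly use, the same card holds roughly 700 tracks.

```python
# A minimal sketch: how many tracks fit on a 4GB microSD card at various
# audio bitrates. The 4-minute track length is an assumed average; the
# card size uses the decimal gigabytes that marketers quote.

CARD_MB = 4000
TRACK_MIN = 4

for kbps in (48, 128, 192):
    # kilobits/s -> megabytes per track: /8 for bytes, /1000 for MB, * seconds
    mb_per_track = kbps / 8 / 1000 * TRACK_MIN * 60
    print(f"{kbps:3d} kbps: {mb_per_track:.2f} MB/track, "
          f"~{CARD_MB / mb_per_track:,.0f} tracks")
```
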
From Another source
Nokia has entered the mobile music arena, launching its own music download store in Europe along with four entertainment and media-focused handsets.
Finland's Nokia has decided to toss its hat into the ring of digital music sales, announcing today the Nokia Music Store, launching "this fall" in key European markets and expanding into additional European and Asian markets over the next few months. The Nokia Music Store will offer millions of tracks from major-label artists as well as music from independent labels and regional artists for €1 per track, €10 per album, with a PC-based streaming subscription option for €10/month. At least for now, tracks will "typically" be offered in DRM-protected 192Kbps WMA format, thanks in part to Nokia's recent deal with Microsoft to support Windows Media formats.
The Nokia Music Store is part of Nokia's new Ovi brand of Internet services, currently comprising Nokia's music, mapping, and relaunched N-Gage mobile gaming services. The Nokia Music Store will offer both download-based and streaming music services, plus PC-based music services by way of a new Nokia Music PC client, due to be available later this year. The PC client will support reverse synchronization of playlists, enable users to rip a conventional audio CD to both their PC and mobile device simultaneously, and automatically sync purchased music between mobile devices and a PC music collection. The Nokia Music Store will also offer dynamic recommendations based on users' preferences, as well as genre-based "instant playlists" designed to help users discover (and, of course, purchase) new music.
"The Nokia Music Store brings together a powerful combination of great music and great devices in an easy to use way," said Tommi Mustonen, the head of Nokia's music activities. "You can select from a huge range of music, including local music from your country, and download it directly to your Nokia device. You can choose between purchasing tracks a la carte via your Nokia device or computer, or you can stream an unlimited number of full length tracks to your computer."
Right now, there's no word when (or if) Nokia plans to bring the Nokia Music Store to the North American market.
Of course, a mobile-enabled music service needs music-savvy mobile devices capable of browsing and downloading music content at acceptable speeds. To that end, Nokia also announced four new entertainment- and music-focused handsets for the European market, expected to begin shipping later in 2007 with prices ranging from €225 to €560. The Nokia N81 and N81 8GB will offer dedicated music and gaming keys in addition to Nokia's stable of "multimedia computer" and smartphone functions. The Nokia N95 8GB will sport a five-megapixel camera, GPS capability, HSDPA high-speed wireless access, and a mammoth (for a phone) 2.8-inch LCD display -- and readers would be correct in assuming the N95 falls at the high end of the new handsets' price range, above. Finally, Nokia is introducing two new XpressMusic handsets, the 5310 and 5610. The slim 5310 sports a 2-megapixel camera, a 2-inch LCD screen, and 4 GB of onboard memory (with another 4 GB available via microSD expansion), while the 5610 builds on those specs with a 2.2-inch LCD display, a 3.2-megapixel camera, and longer battery life.
Nokia's Music Store -- and associated media-centered handsets -- may well find a foothold in the European and Asian markets, where high-speed wireless networks are less fragmented than in the North American market and high-speed solutions like HSDPA are more practical. And even though Nokia is the world's largest handset maker, it's still looking to diversify its revenue sources as the mobile market matures: a solid content-delivery business would be a good step in that direction. It remains to be seen whether Nokia will attempt to offer the services in the U.S. market, or will settle for merely partnering with U.S. carriers -- which pretty much all run their own mobile music operations -- to distribute North American versions of its media-centric handsets.