Monday, 01 September 2008

Google recently seeded a paper comic book to a number of people to present and describe their upcoming web browser (or you might just think of it as the web browser of the future), called Google Chrome.


So, what's the story? Making the browser more stable, more usable and more secure. At first glance, it looks like a strong starting point for the future of Internet browsers. Built from the ground up, with the experience of several years of past browser platforms to learn from, Chrome addresses many of the main concerns in today's browsers.

Now the only question is: When will we get it? I will be watching here to see if something shows up. Hopefully it's soon!

UPDATE: The release date is tomorrow (Tuesday, September 2, 2008) - more info and a link to screenshots here.

A variety of technologies are incorporated into the Chrome design to improve on common browser weaknesses. The key improvements fall into the areas of stability (memory allocation and management, plus process management that isolates each tab in its own process), some incredibly cool JavaScript environment enhancements (in the form of a new, open-source JavaScript engine), a bunch of user experience improvements, and significant security changes.
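To make the process-management point concrete, here's a toy sketch (mine, in Python, and emphatically not Chrome's actual C++ internals) of the idea behind per-tab process isolation: when each tab's renderer runs in its own OS process, one misbehaving page can crash its own process without taking the rest of the browser down.

```python
# Toy illustration of process-per-tab isolation; the URLs are made up
# and this is not Chrome's actual architecture code.
import multiprocessing

def render_tab(url):
    """Stand-in for a per-tab renderer process."""
    if "crash" in url:
        raise RuntimeError(f"renderer for {url} crashed!")
    print(f"{url}: rendered fine")

if __name__ == "__main__":
    tabs = ["http://example.com", "http://crash.example", "http://news.example"]
    renderers = [multiprocessing.Process(target=render_tab, args=(u,)) for u in tabs]
    for r in renderers:
        r.start()
    for r, url in zip(renderers, tabs):
        r.join()
        if r.exitcode != 0:  # that tab's renderer process died...
            print(f"browser: tab {url} crashed (exit {r.exitcode}); other tabs unaffected")
```

In a single-process browser, that one bad page would have been a failure inside the same process as every other tab - which is exactly the stability weakness this design targets.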

And it's all open source. That's right - anyone (including other browser makers) can leverage the work done in the Chrome project and can contribute to it or modify it to meet their own needs. Good move, Google.


Pretty exciting stuff. It will be fun to see what comes next, and when.



IT Security | Safe Computing | Tech
Monday, 01 September 2008 10:57:24 (Pacific Standard Time, UTC-08:00)

Friday, 29 August 2008

As mentioned the other day, LinkedIn today released their new Groups features. Groups are one of the most popular features on LinkedIn, despite the limited functionality provided for groups on the site in the past.

The new features include a searchable contacts roster (search by name, company, or other keywords such as specific areas of expertise), which is accessible to all members; and discussions with email-digest notifications (configurable by individual group members). LinkedIn has published an informational page describing the new functionality, and the screen clips I captured highlight the following:

- Notification when you sign in that your managed group now has new features
- New tabs reflecting the new functionality
- Choosing your notification email delivery preferences for discussions
- Writing a new discussion topic for the group
- The recent discussions list

Tech
Friday, 29 August 2008 01:08:33 (Pacific Standard Time, UTC-08:00)

Vidoop Labs has a dream:

The dream is to see Identity baked into all browsers. Just imagine opening your web browser and then selecting your Identity Provider (IDP) the way you select your default search provider. The benefits are numerous; never type in a username, never look for a login button/page (you are authenticated when you land on a domain), no phishing/MITM (the browser can do domain and SSL cert validation). You fire up your browser and authenticate (or login) similar to the way you log in to your computer every time you turn it on. The difference is you get to choose your provider and can take control of the data you safeguard, store and share on the Internet.

I could get into that.

Vidoop is a Portland, Oregon company that has built some interesting technology around OpenID. I really like the idea of OpenID, and I have a couple of OpenIDs of my own that I use on various sites. But OpenID is not exactly perfect. It's still relatively young, and from a usability standpoint it needs improvement. The identity and authentication requirements of the modern Internet demand additional features and capabilities that OpenID doesn't deliver (and you can argue that it shouldn't). By combining OpenID with other technologies (such as Information Cards and other strong-authentication offerings) and improving usability for end users, it could become a widely adopted, trusted standard - or part of a broader one covering strong authentication and identity protection/assertion in a commonly accepted, widely deployed package.
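To illustrate the usability gap, here's a stripped-down sketch of the redirect dance an OpenID relying party performs (my own simplification with hypothetical URLs; a real site would use a library such as python-openid, which also handles discovery, associations, and response verification):

```python
# Simplified sketch of an OpenID 2.0 authentication redirect.
# The endpoints below are hypothetical; discovery, association, and
# signature verification are omitted for brevity.
from urllib.parse import urlencode

OPENID_NS = "http://specs.openid.net/auth/2.0"
IDENTIFIER_SELECT = "http://specs.openid.net/auth/2.0/identifier_select"

def build_auth_redirect(op_endpoint: str, return_to: str) -> str:
    """After the user types an identifier and the site discovers the
    OpenID provider (OP) endpoint, the browser is redirected here."""
    params = {
        "openid.ns": OPENID_NS,
        "openid.mode": "checkid_setup",
        "openid.claimed_id": IDENTIFIER_SELECT,
        "openid.identity": IDENTIFIER_SELECT,
        "openid.return_to": return_to,
    }
    return f"{op_endpoint}?{urlencode(params)}"

# Hypothetical relying party and provider, for illustration only:
print(build_auth_redirect("https://openid.example.org/auth",
                          "https://myblog.example.com/openid/return"))
# The provider authenticates the user, then redirects back to return_to
# with a signed response the site must verify.
```

Every one of those hops happens in front of the user, which is exactly why baking identity selection into the browser itself is so appealing.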

Vidoop's Luke Sontag today posted an announcement that the company's newly formed Vidoop Labs has fired up a community project called IDIB (pronounced "Eye-Dib"), which aims to improve the OpenID usability model and make it stronger at the same time. They've released a developer preview of IDIB in hopes of getting people involved and gathering input and feedback.

From the Vidoop announcement:

Over the past few years we’ve seen the adoption of OpenID continue to increase, but the work that we’ve done as a community to develop this technology has only just begun. Looking at the landscape of OpenID adoption, it’s clear that there are several key factors inhibiting adoption, but two that we want to focus on today, namely usability and security in the browser.

It was almost two years ago when the Firefox 3.0 roadmap was announced and OpenID was mentioned as a new component to the platform. The Mozilla Firefox team looked to members of the OpenID community to step up and provide guidance on what exactly we imagined identity in the browser looking like, but we failed to mobilize and answer their call.

In light of that missed opportunity, Vidoop Labs has been working hard over the last several weeks to produce a prototype that we intend to use to initiate a wider discussion about OpenID in the browser and what it might look like.

And the current developer preview (which is open source) is just a beginning. Imagine leveraging Information Cards (such as one would use with Microsoft's CardSpace, or the similar open-source offerings for Mac and Linux) in the cloud, and being able to use OpenID - one logon for all your web sites - confidently and with proper security protection.

The Internet needs a good, strong, reliable, usable and secure standard technology to solve the issues related to user names, passwords, single sign on and identity protection. IDIB looks like a serious and positive attempt to start the journey directly down that path.



IT Security | Tech
Thursday, 28 August 2008 23:18:19 (Pacific Standard Time, UTC-08:00)

Thursday, 28 August 2008

I thought I'd present some casual observations I made throughout the day Wednesday on a trip from Portland to Seattle, as well as some newly reported information about the AT&T 3G network that's hit the 'net over the past 24 hours or so.

The back-story here is that I - like many others - have found the reliability and consistency of the iPhone 3G to be less than satisfactory while on the 3G AT&T network.

First of all, it became clear to me over the course of several hours yesterday that the iPhone itself is not to blame when it comes to connectivity on the 3G network. While driving from Portland, Oregon to Seattle, Washington and back, I had the opportunity to run a whole slew of speed/connectivity test sessions using the iPhone app called "iNetwork Test" (click here to get the free app in the iTunes App Store).

AT&T actually has fairly impressive 3G network coverage from south of Olympia, Washington practically all the way to Seattle, with one or two small gaps in between where the phone switched to EDGE. Much of the area along that I-5 corridor is rural or sparsely populated. From a wireless connectivity standpoint, it's a pretty decent place to live if you're going to be far away from the city.

My experience in using the 3G network along my drive up and down the Interstate can be summed up thusly:

In areas with higher population density - and thus more iPhone (and other device) users - the ability to a) connect to the voice network and make calls, b) stay connected to the voice network, c) make data connections, and d) maintain data connections was substantially worse. The difference between dense and sparsely populated areas was like night and day.

Where population density was lower, even in cases where fewer bars were displayed on the signal-strength icon, voice and data connections were reliable and solid without exception. In contrast, in high-population areas even full-signal connectivity was spotty and unreliable.

I'm running the latest iPhone software, v2.0.2, which both Apple and AT&T have encouraged people to upgrade to. AT&T even sent a text message to all users asking them to upgrade - a first-time action on the part of the carrier.

Some new information, part of which you'll find quoted below, helps explain why I experienced substantially poorer performance in the cities and heavily-populated areas but not in the rural sections of my drive. According to reports, it appears AT&T's 3G radio systems are power-constrained, and are not able to maintain all the connections. The incredible number of iPhone 3G devices on the network - especially in metropolitan and urban areas - is most certainly placing a heavy load on the radios. In addition, iPhone 3G devices that have not been updated to the v2.0.2 software are placing an even heavier burden on the radios from a power-consumption standpoint.

So, there's a power-management problem as well as a capacity problem. When the "noise" in the radio spectrum gets higher, the towers have to increase transmit power to try to overcome it - but that extra power itself raises the noise floor every other handset sees, so the cycle feeds on itself. Eventually the noise keeps climbing, and the power consumption at the tower (and presumably on the iPhone as well) goes through the roof.
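To see why that loop runs away, here's a tiny numerical sketch (my own toy model with invented numbers, not AT&T's actual radio engineering): every watt a tower transmits raises the noise floor a little for everyone, each handset asks for enough power to overcome the current noise, and a handset with a faulty power-control algorithm asks for more than it needs.

```python
# Toy model of the downlink power feedback loop; all numbers are invented.
TOWER_POWER_CAP = 20.0  # total watts the cell site can transmit
TARGET_SIGNAL = 1.0     # signal level each handset needs (arbitrary units)
BASE_NOISE = 0.5        # background noise floor
LEAKAGE = 0.05          # fraction of each transmitted watt others see as noise

def tower_power(num_handsets, power_error=1.0, rounds=100):
    """Iterate the loop: transmitted power raises the noise floor, which
    makes every handset demand more power. power_error > 1.0 models
    handsets asking for more than the channel requires (the faulty
    pre-2.0.2 algorithm the AT&T source describes)."""
    total = 0.0
    for _ in range(rounds):
        noise = BASE_NOISE + LEAKAGE * total
        total = num_handsets * TARGET_SIGNAL * noise * power_error
        if total > TOWER_POWER_CAP:
            return None  # transmitter out of power: calls start dropping
    return total

for n in (5, 10, 20):
    for err, label in ((1.0, "good power control"), (1.5, "faulty power control")):
        p = tower_power(n, err)
        verdict = f"settles at {p:.2f} W" if p is not None else "exceeds the cap - calls drop"
        print(f"{n:2d} handsets, {label}: {verdict}")
```

With these made-up numbers, ten well-behaved handsets settle under the power cap, ten greedy ones blow past it, and twenty of either kind exhaust the transmitter outright - the power-management problem and the capacity problem in one picture.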

More towers would increase capacity, reduce power requirements and resulting noise, and generally improve coverage. But that's not something that can be changed overnight.

All of this helps explain why my ability to make calls, connect to the 3G data network and download at high speeds was much better where the network is only lightly used.

The Daily Tech site has a detailed report (and some intelligent reader comments) describing the cell-site power issues, the problems related to the older iPhone 3G software, and other items; go there for all the details. Here is a portion of the information, including some text quoted from Roughly Drafted Magazine, whose author got new details from a source inside AT&T's wireless business describing the power issues and what the iPhone's v2.0.2 software update changes:

Basically the update "fixed power control on the mobile," according to the source. To understand what the source says next, you must first know a bit about AT&T's jargon for UMTS -- the technology it uses to deliver its 3G network. In UMTS, phones are referred to as user equipment, "UE" for short. The base transceiver station towers are known as "Node B".

With this jargon in mind, the AT&T source explains:
"In UMTS power control is key to the mobile and network success. If the UE requires too much downlink power then the base station or Node B can run out of transmitter power and this is what was happening. As you get more UEs on the cell, the noise floor rises and the cell has to compensate by ramping up its power to the UEs. If the UE power control algorithm is faulty then they will demand more power from the cell than is necessary and with multiple users this can cause the cell transmitter to run out of power. The net result is that some UEs will drop their call. I have seen the dropped call graphs that correspond to the iPhone launch and when the 2.0.2 firmware was released. The increase in dropped calls, (were the result of) dropped calls due to a lack of downlink power."
In essence, the iPhone is asking for a stronger signal than it needs. In areas with lots of users, some or all of whose phones are doing this, calls start to get dropped and signal quality falls. This all fits with the conclusions the media had reached -- the problems were somehow correlated with user distribution and seemed, puzzlingly, to lie both with AT&T's network and with the hardware.

The source continues:
"The power control issue will also have an effect on the data throughput, because the higher the data rate the more power the Node B transmitter requires to transmit. If the UEs have poor power control and are taking more power than is necessary then it will sap the network’s ability to deliver high speed data. This is one of the reasons why AT&T has been sending text messages to users to persuade them to upgrade to the 2.0.2 software. In a mixed environment where users are running 2.0, 2.0.1, and 2.0.2, the power control problems of 2.0 and 2.0.1 will affect the 2.0.2 users. It is not the network that is fault but the interaction of the bad power control algorithm in 2.0 and 2.0.1 software and the network that is at fault. The sooner everybody is running 2.0.2 software the better things will be. Having seen the graphs the 2.0.2 software has already started to make difference."
Since transmitting lots of data takes lots of transmission power, and phones were unnecessarily demanding more power than their actual use required, the network in areas of heavy use was unable to handle high-speed data.


Apple | Mobile | Tech
Thursday, 28 August 2008 18:21:35 (Pacific Standard Time, UTC-08:00)

My first-generation Nikon D70, which I bought the day it was released a few years back, died on me a few months ago. Without a card in it, it won't start, and when I insert a CF card in the slot, the green data-access indicator flashes on and off. If I hold down the Menu button, the menu flashes on and off along with the green LED.

As it turns out, this is a known problem with the original Nikon D70 cameras, and Nikon USA has a service bulletin out on the camera body. They'll repair it free of charge.

So, if you have the same problem, visit this service bulletin page, click on the D70, and you can access a PDF file that you'll need to print, fill out and send to Nikon along with your camera body. Be sure to take your camera strap off and remove the battery, and don't send any lenses or other accessories.

Mine's on its way to Nikon now - they say the turnaround is five days (plus shipping time).



Photography | Tech
Thursday, 28 August 2008 15:51:59 (Pacific Standard Time, UTC-08:00)

Wednesday, 27 August 2008
Well, this is a little embarrassing. Intergalactic malware has made its way into the news: a computer virus on the International Space Station. There's no AV software on the laptops the astronauts use, nor (apparently) is there any process for security-checking personal computer equipment - like the USB thumb drives carried by astronauts being rocketed to the International Space Station.

Granted, the virus in question is pretty innocuous, and apparently other viruses that have made it into space aboard computer gear in the past (it's really quite difficult to mention that in passing) have also been more of an inconvenience than a real security threat.

But imagine a virus that might make its way on board and do more damage. Not good. It looks like it's time for some effective process and possibly some basic security technology - you know, just in case.

The author of that virus has something new to brag about, though. That's for sure.



IT Security | Tech
Wednesday, 27 August 2008 20:01:30 (Pacific Standard Time, UTC-08:00)