Saturday, 28 October 2006
(This discussion refers to Outlook 2007 Beta 2 Technical Refresh)
I'm really not sure how I feel about this.
There was a big discussion about whether the feed:// protocol was needed. Personally, I've always said I think it IS needed, while the RSS Team at Microsoft disagrees.
However, I just noticed that not only does Outlook store its RSS in the PST (and sync with the Common Feed Store, which we already knew), but it also registers two new "Protocol Handlers" explicitly for handling RSS feeds - they are OUTLOOKFEED:// and OUTLOOKFEEDS://, with the latter including an "S" for secure feeds.
This doesn't seem exactly fair or consistent. I understand that an enterprise, especially one using SharePoint, would want to have folks subscribe to a feed directly into Outlook. However, not only is Outlook creating these new Outlook-specific pseudo-protocols, it's also taking over FEED:// as well. We'll see if there are changes in the next RC.
That doesn't seem fair. What if RssBandit started using RSSBANDITFEED://? Of course, any of these aggregators could try to take over OUTLOOKFEED://, although Outlook would likely bork. However, it's the very existence of this custom pseudo-protocol that I find offensive; it doesn't matter that it can probably be disabled.
ASIDE: For some reason FeedDemon always warns me that it isn't the default feed reader (i.e. it's not associated with the feed:// protocol), and even though I want it to be the default aggregator, it keeps prompting. This might be a Vista-specific administrative thing, but I suspect Outlook is taking over feed:// also.
You can test these various protocols on your machine by trying each of the following links:
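Under the hood, a pseudo-protocol like this is nothing more than a few registry keys. Here's a sketch of how any aggregator could claim one - note that the protocol name and executable path below are hypothetical for illustration, not Outlook's actual registration:

```reg
Windows Registry Editor Version 5.00

; Hypothetical registration of a "feed:"-style URL protocol.
; "myfeed" and the MyAggregator.exe path are made up for illustration;
; Outlook's real OUTLOOKFEED: registration will differ.
[HKEY_CLASSES_ROOT\myfeed]
@="URL:MyFeed Protocol"
"URL Protocol"=""

[HKEY_CLASSES_ROOT\myfeed\shell\open\command]
@="\"C:\\Program Files\\MyAggregator\\MyAggregator.exe\" \"%1\""
```

Clicking a myfeed:// link would then launch the registered executable with the URL passed as %1. Whichever application writes these keys last "owns" the protocol, which is exactly the land-grab concern above.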
Also, right now, if you click an RSS Feed while running FeedDemon (just using FeedDemon as an example application that eats RSS but also hosts IE7) then IE7 tries to subscribe using the RSS Platform and the Common Feed Store, when really FeedDemon should be getting the subscription request. I know that Nick @ FeedDemon will eventually fix this with some cleverness, but should he really have to?
I'm just unclear on the usefulness thus far of the Common Feed Store. I like the API (inside msfeeds.dll and a few other places that you'll get quietly when you get IE7), even though it's COM-based, and I like that it handles the retrieval and the parsing/canonicalization of the various feed formats. However, it's unclear how I am to administer it effectively. IE7's interface is a little weak if you have 400 feeds. There's no shift-select-delete support in either IE or in Outlook 2007, so I can't remove the hundreds of duplicate feeds that have appeared in the last few weeks. I've found the sync'ing solution from NewsGator to be a decent start - as an idea - but the implementation is NOT working well, as it's incredibly slow and 10% of my feeds just don't sync.
Rather than blaming NewsGator or Microsoft, I'm forced to ask, is it really this hard to keep my Feeds and Read Status sync'ed between a few computers and a few applications? Apparently it's wicked hard...this leads me to wonder if ONLINE feed reading is where it's at.
Apparently my readership thinks so. At least half of you are using online aggregators (or NewsGator sync'ed aggregators which includes NewsGator proper as well as FeedDemon when you're sync'ing feeds).
What do you think? Do you read your feeds online?
Do you like the one-click convenience of FEED://, do you prefer Firefox's clever Feed Reader Chooser, or are you a Right Click|Copy URL|Alt-Tab|Subscribe|Paste|OK type?
We have to change our clocks on Sunday here in the US, but if you're running a server, you need to know about this:
Sunday marks end of daylight savings time
October 27, 2006
[Folks across the country] can look forward to an extra hour of sleep as clocks turn back one hour in observance of daylight savings time at 2 a.m. Sunday morning.
The time shift occurs twice per year in North America: one hour is lost on the first Sunday in April, and the hour is gained back on the last Sunday in October. This policy was enacted by the Uniform Time Act in 1966.
Lawmakers made recent changes that will extend DST four to five weeks beginning in 2007. The Energy Policy Act of 2005 will change DST’s duration from the second Sunday of March to the first Sunday in November. Those in favor of the change said the US will save on energy costs by taking advantage of the extra hour of sunlight during the extended period.
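The new rule is mechanical enough to code up. Here's a quick C# sketch (my own illustration, not from the article) that computes the 2007 transition dates under the Energy Policy Act rules:

```csharp
using System;

class DstDates
{
    // Nth occurrence of a weekday in a given month/year (n is 1-based).
    static DateTime NthWeekday(int year, int month, DayOfWeek day, int n)
    {
        DateTime first = new DateTime(year, month, 1);
        int offset = ((int)day - (int)first.DayOfWeek + 7) % 7;
        return first.AddDays(offset + 7 * (n - 1));
    }

    static void Main()
    {
        // Energy Policy Act of 2005 rules, effective 2007:
        DateTime start = NthWeekday(2007, 3, DayOfWeek.Sunday, 2);  // second Sunday in March
        DateTime end = NthWeekday(2007, 11, DayOfWeek.Sunday, 1);   // first Sunday in November
        Console.WriteLine("DST 2007: {0:d} to {1:d}", start, end);  // March 11 to November 4
    }
}
```

This is exactly the calculation every OS and calendaring product had hard-coded under the old April/October rules, which is why the patch is needed.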
It'll save on energy costs, but I wonder what the IT cost will be. Windows folks can check out http://www.microsoft.com/windows/timezone/dst2007.mspx for more details. Thanks to Tim Heuer and Bill Evjen for the pointers! A test version of the patch is available through support as KB924840.
From that page:
Windows XP SP2 and Windows Server 2003 will require the update. Windows XP SP1 and older operating system versions have passed their end of support dates and will not be receiving the update. Windows 2000 has passed the end of mainstream support and will not be receiving an update without an Extended Support Hotfix Agreement. Find more information about support policies around hotfixes.
So all you Windows 95, 98, ME, and 2000 folks, be prepared to be an hour off until you notice it and change it yourself. I predict 15 minutes until someone writes a freeware utility to fix this problem themselves. Will it be me? Nope, I'm taking Zenzo to the Children's Museum.
Wednesday, 25 October 2006
When did iTunes start sucking?
I rarely Blog Bile™ but I've been an iTunes fan since day one, and suddenly iTunes 7 is the only application that can utterly suck the life out of Windows. It's ridiculously slow. Literally simple things like moving or resizing the window are "Click...wait 2 seconds...drag" operations. I've got a lousy 7034 songs and I can't even scroll or search without pain. It's bad under Windows XP, but it's unusable under Vista. I'm also totally unable to play my protected songs under Vista RC2. I've googled, but I'm not getting a sense that this is a pervasive problem.
Is anyone else seeing this problem? What happened to cause iTunes to fall from grace?
Here's an excerpt from a "mid-level" educational/nutshell whitepaper I'm doing on the new shiny SSL certificates that are coming soon. If you want information from someone who REALLY knows what they are talking about, subscribe to Tim Callan's SSL Blog. Also, watch the IEBlog. If you're running IE7, you can download and install a sample testing certificate, then visit the fictional https://www.woodgrovebank.com and see the new certificates in action.
ASIDE: On a totally different (but, eh, slightly related) note (and I'll blog or Hanselminutes.com about this later), if you're running IE7 and .NET Framework 3.0, check this out.
SSL - Secure Sockets Layer
Every online banking site protects its users' data while it is in transit on the wire using Secure Sockets Layer (SSL), which runs one layer below protocols like HTTP and FTP. Many end users are informed enough to look for the "s" in HTTPS in their browser's address bar, and most look for a lock in the browser status bar before sending private data across the Internet.
Early versions of SSL used comparatively weak 40-bit encryption but most sites now use at the very least 128-bit and in some cases, 256-bit AES encryption. Many impose this important restriction by default by allowing only SSL3.0/TLS1.0 over HTTPS.
This screenshot from the Mozilla Firefox browser shows the encryption strength of two different banking sites. This dialog is reached by clicking on the lock icon within the browser.
In these examples, both sites are using high-grade encryption.
Recently more and more phishers have been successful in fooling the public into giving up personal information with the use of so-called "domain-authenticated SSL Certificates." These SSL Certificates go through virtually no background check to prove the site is who it says it is; they prove only ownership of the domain name. Because the general public rarely clicks on the lock icon to view more information about the company or organization behind an SSL connection, people assume that a secure connection equals a trusted connection. This, of course, is not the case. Unfortunately these SSL Certificates look essentially the same to the browser as those issued by a highly trusted certification authority, thereby making a phisher's site look "as secure" as your bank's site.
High Assurance or Extended Validation SSL Certificates are a new kind of SSL certificate that will be treated very differently by newer browsers. Internet Explorer 7 will be the first browser to take advantage of this new technology with others like Firefox and Opera very close behind. This standard is being actively developed by the CA/Browser Forum as of this writing and will be referred to commonly as EV SSL Certificates.
To quote from Tim Callan’s SSL Blog at http://blogs.verisign.com/ssl-blog/2006/03/a_new_kind_of_ssl_certificate_1.html:
If every Internet user in the world had a browser that recognized the difference between High Assurance SSL Certificates and traditional ones and if every legitimate site used a High Assurance certificate, then phishing as we know it today would essentially be eliminated.
A lofty goal indeed, but one worth striving for.
When visiting a test Banking Site that has an EV SSL Certificate using IE7, the address bar turns green and a new active lock icon appears showing the name of the organization this site claims to be.
The lock icon toggles back and forth also showing the Certificate Authority that issued the certificate.
If the user clicks anywhere in the secured area of the address bar, the identifying EV SSL Certificate popup is green and shows the user information they can use to make the decision to trust this site or not.
As of this writing EV SSL Certificates are not yet available for purchase, but they are expected very soon as the standard is finalized. Within a year, expect all major browsers to support the standard, and within another year most e-commerce users will know to watch for the new browser behaviors when making their decisions. I predict some browsers will have settings that allow users to visit sites over SSL only if those sites use EV SSL certificates.
Educate your organization about the importance of having an EV SSL certificate when they are ready to be issued, and be prepared to meet the much more rigorous standards that will be expected by the Certificate Authority before they issue one. There will likely be a revised Certificate Authority WebTrust auditing standard (usually called CA Web Trust) that CAs will have to pass before they can issue an EV SSL certificate, and CAs will impose much stricter vetting procedures to verify the company or organization requesting the certificate is who they say they are.
Given the concerns on today's Internet around privacy and control over content, every e-commerce or banking site should be prepared to upgrade their SSL Certificates to EV SSL. There's no downside.
Wednesday, 25 October 2006 20:48:37 (Pacific Daylight Time, UTC-07:00) by Scott | Trackback
My thirty-eighth Podcast is up. This one is a little off the beaten path, but it's a topic that is near and dear to me as I'm a Type 1 Diabetic on both an Insulin Pump and Continuous Glucose Meter - 24 hours a day. I figure since you're all technologists you'd be interested in some of the discussion around how this problem can be solved, mostly using technology. I hope you enjoy it.
We're listed in the iTunes Podcast Directory, so I encourage you to subscribe with a single click (two in Firefox) with the button below. For those of you on slower connections there are lo-fi and torrent-based versions as well.
Links from the show are also always on the show site, although this show had no links to speak of. Do also remember the archives are always up and they have PDF Transcripts, a little-known feature that shows up a few weeks after each show.
Our sponsors are Aspose, /nsoftware, CodeSmith Tools and the .NET Dev Journal.
There's a $100 off CodeSmith coupon for Hanselminutes listeners - it's coupon code HM100. Spread the word, now's the time to buy. This coupon is good for the CodeSmith Professional With 1 Year Premier Support option.
As I've said before this show comes to you with the audio expertise and stewardship of Carl Franklin. The name comes from Travis Illig, but the goal of the show is simple. Avoid wasting the listener's time. (and make the commute less boring)
- The basic MP3 feed is here, and the iPod friendly one is here. There's a number of other ways you can get it (streaming, straight download, etc.) that are all up on the site just below the fold. I use iTunes, myself, to listen to most podcasts, but I also use FeedDemon and its built-in support.
- Note that for now, because of bandwidth constraints, the feeds always have just the current show. If you want to get an old show (and because many Podcasting Clients aren't smart enough to not download the file more than once) you can always find them at http://www.hanselminutes.com.
- I have included, and will continue to include, the enclosures in this feed you're reading, so if you're already subscribed to ComputerZen and you're not interested in cluttering your life with another feed, you have the choice to get the 'cast as well.
- If there's a topic you'd like to hear, perhaps one that is better spoken than presented on a blog, or a great tool you can't live without, contact me and I'll get it in the queue!
Enjoy. Who knows what'll happen in the next show?
Tuesday, 24 October 2006
Last year this time, I posted my reading list for that month with the grand idea of posting the list monthly, but it's just such a hassle to get the books input into the post. (Should have used Amazoner, I suppose) However, I remembered, belatedly yesterday that the whole point of writing the Windows Live Writer CueCat/Amazon Plugin (I need a better name) was to make this kind of list. So, here's a partial list of what I've just finished reading, or that I'm in the middle of reading.
- I've just finished reading Stardust by Neil Gaiman and what a fine book it is! I noticed on Amazon that folks who read the kinds of books I read also read Gaiman, so on a whim I just went to my local book store and bought every Gaiman book. I was not disappointed. He definitely has a distinctive writing style, and his books can only be described as modern fantasy. Truly a great book by a great author.
- I'm about 300 pages into American Gods: A Novel by Neil Gaiman and it's another that did not disappoint. It's a little more obscure in its references and I've had to look up a few mythological things, but it's a book that is hard to put down. It could be a fantastic movie if someone truly cared enough about doing it right. Do check out his blog as well.
- I'm always trying to Learn Zulu but there's not a lot of books published in the last 10 years on the subject. I keep this one around just to stay frosty.
- Travis has been trying to get me into Vurt by Jeff Noon and it's just not happening. I've been 150 pages into this thing for at least 2 months now and I just can't slog through it. It's so abstract as to be obtuse. I'm hoping it picks up soon.
- I enjoyed The Goal by Eli Goldratt so Chris Brooks recommended Critical Chain. I'm only a few chapters in, but it's already got me thinking.
- As a new Dad, I'm loving John Rosemond's New Parent Power! It's huge, but appropriately broad in scope. I particularly like the "Principle of Benign Deprivation: Give your kids 100% of what they need, and 10% of what they want." That's how I was raised and I think it's a great way to manage things in this tricky American metaculture of acquisition we live in.
- Chris Sells recommended On Basilisk Station (Honor Harrington) by David Weber and I'm about 1/4 of the way through, but it's just not gripping me. Not sure why; it just reads so old. How can a paperback read worse than a hardback with bright white paper?
- I finished a re-read of The Forever War by Joe Haldeman and while there's a whole overly weird Free Love section that reeks of the 70s, the message is clear and while it was a thinly veiled Vietnam War protest novel, it could be read as a thinly veiled Iraq War protest novel. The Time Dilation stuff is always fun, with a great ending to get you thinking.
- I'm about done with Abraham: A Journey to the Heart of Three Faiths by Bruce Feiler that explores the relationship that Islam, Judaism and Christianity have with Abraham, and how things seem to hinge on their differing views of him as a biblical and possibly historical figure.
- I'm really enjoying A Short History of Nearly Everything by Bill Bryson. It is clearly a history book more than it is a Popular Science book, but the author's zest for the topic(s) and the huge breadth of the book really put a human face on the discoveries (unfortunately largely Western) of the last few hundred years and how they relate to the fullness of time.
- I've got Mo reading Kindred (Bluestreak Black Women Writers) by Octavia E. Butler. This is an alternate history book, but more a Time Travel book where the time travel itself is glossed over from a technical point of view yet fundamental to the point. A modern Black woman is pulled back to early 19th century Maryland and is enslaved by her great-great-great-grandfather. Another alternate-universe book by way of racial allegory is The Intuitionist, about the theoretical first Black female Elevator Inspector. Also recommended.
- You can never go wrong with anything Philip K. Dick writes, so I fall asleep with a re-reading of any of his great short stories like those in The Eye of The Sibyl and Other Classic Stories (The Collected Short Stories of Philip K. Dick, Vol. 5) by Philip K. Dick.
- I thoroughly enjoy Ursula Le Guin's work, and I was particularly pulled into Rocannon's World in this compilation of three novels in one: Worlds of Exile and Illusion: Rocannon's World, Planet of Exile, City of Illusions by Ursula K. Le Guin
- Philip K. Dick writes a lot of alternative history - what if Hitler won the war?-type stuff. In The Man in the High Castle, Americans live under Japanese occupation, and the book explores the relationship between German and Japanese culture.
- I loved Neverwhere: A Novel by Neil Gaiman. I'm only halfway through American Gods, but so far Neverwhere is my favorite Gaiman book. It's set in the world of London Below, a parallel world in the sewers where those we've forgotten go. I won't ruin it for you, just check it out.
Well, the wife and I are off to dinner, it's our 6th wedding anniversary! (We eloped a year before the white-dress-wedding)
Tuesday, 24 October 2006 21:01:27 (Pacific Daylight Time, UTC-07:00) by Scott | Trackback
Monday, 23 October 2006
The MSDN docs are very careful not to recommend using impersonation because it affects connection pooling when talking to databases downstream. The suggestion that one take care when using impersonation has been in place since its inception.
Know Your Tradeoffs with Impersonation
Be aware that impersonation prevents the efficient use of connection pooling if you access downstream databases by using the impersonated identity. This impacts the ability of your application to scale. Also, using impersonation can introduce other security vulnerabilities, particularly in multi-threaded applications, such as ASP.NET Web applications.
You might need impersonation if you need to:
· Flow the original caller's security context to the middle tier and/or data tier of your Web application to support fine-grained (per-user) authorization.
· Flow the original caller's security context to the downstream tiers to support operating system level auditing.
· Access a particular network resource by using a specific identity.
ScottGu has a good post on how to use declarative authorization to restrict access without impersonation. This works great with Forms Authentication and Custom Principals like we use at Corillian. Here's one of his examples:
using System;
using System.Security.Permissions;

[PrincipalPermission(SecurityAction.Demand, Authenticated = true)]
public class EmployeeManager
{
    [PrincipalPermission(SecurityAction.Demand, Role = "Manager")]
    public Employee LookupEmployee(int employeeID)
    {
        return null; // todo
    }

    [PrincipalPermission(SecurityAction.Demand, Role = "HR")]
    public void AddEmployee(Employee e)
    {
        // todo
    }
}
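To show how a declarative demand like this plays out with custom principals (a la Forms Authentication), here's a small self-contained sketch. The method body, user name, and role names are mine for illustration, not ScottGu's:

```csharp
using System;
using System.Security;
using System.Security.Permissions;
using System.Security.Principal;
using System.Threading;

class Program
{
    // Same declarative style as the example above; the body is illustrative.
    [PrincipalPermission(SecurityAction.Demand, Role = "Manager")]
    static string LookupEmployee(int employeeID)
    {
        return "Employee " + employeeID;
    }

    static void Main()
    {
        // Simulate Forms Authentication attaching a custom principal:
        GenericIdentity identity = new GenericIdentity("scott");
        Thread.CurrentPrincipal = new GenericPrincipal(identity, new string[] { "Manager" });
        Console.WriteLine(LookupEmployee(42)); // succeeds: caller is in "Manager"

        // Swap in a principal without the role and the demand throws:
        Thread.CurrentPrincipal = new GenericPrincipal(identity, new string[] { "Clerk" });
        try
        {
            LookupEmployee(42);
        }
        catch (SecurityException)
        {
            Console.WriteLine("Demand failed: caller is not a Manager");
        }
    }
}
```

The demand checks Thread.CurrentPrincipal, so any IPrincipal implementation - including the custom principals mentioned above - participates with no impersonation involved.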
There's all sorts of wacky things one can do with impersonation, but if you ask yourself WHY you need it, perhaps you'll find a simpler solution.
One of my bosses always says, "Guy walks into support, sez he needs a bigger mobile phone antenna. Does he need a bigger antenna, or does he really want better reception? Don't let your users dictate your solution with their statement of the problem."
Monday, 23 October 2006 18:42:14 (Pacific Daylight Time, UTC-07:00) by Scott | Trackback
Sunday, 22 October 2006
Paul Stovell was watching a talk I gave with Keith Pleas at TechEd US 2006 on building your own Enterprise Framework. The basic gist was that architecting/designing/building a framework for other developers is a different task than coding for end users.
One thing that is valuable for context is that Keith and I were playing roles in this presentation. Keith was playing Einstein in his Ivory Tower, the developer who wants perfect purity and follows all the rules. I was playing Mort the realist, the developer who just wants to get the job done. We went back and forth with white slides for Keith, black for me, each of us declaring the extreme view, then coming together on the final slide with some pragmatic and prescriptive guidance.
Paul had an issue with the slide on Extensibility where I, as the hyper-realist, said:
- If they extend it, they will break it
- Use Internal more
- Seal first, ask questions later
Frankly, I think this is crap.
For goodness' sake, Paul, don't sugarcoat it, tell me how you really feel! ;) Just kidding. He has some interesting observations and (some) valid points.
If you are developing a framework or API for someone else to use, and you think you know more about how they plan to use your API than they do, you've got balls. [Paul]
I mostly agree with this. However, you certainly need to have SOME idea of what they are using it for, as you're on the hook to support it in every funky way they might use it. It is reasonable to have some general parameters for how your API should be used. If you design it poorly, it will likely get used in ways that end up giving the developer a bad experience or even breaking the app.
For example, in a logging service we had a method called ConfigureAndWatch that mirrored the log4net ConfigureAndWatch. It's meant to be called once per AppDomain, and never again. Because it was poorly named (we took the internal implementation's name) and didn't offer any feedback (via exceptions, return values, or logging of its own), some users would call it on every page view within ASP.NET, causing a serious performance problem. There's a number of ways this problem could be solved, but the point is that there needs to be a boundary for the context in which an API is used. If we had constrained this more - and by doing that, presumed we know more than they do - then some problems would have been avoided.
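One way that boundary could have been enforced is a once-only guard that makes repeat calls cheap and loud. A hypothetical sketch - the class, method, and message wording are my own, not our actual logging service:

```csharp
using System;

// Hypothetical wrapper illustrating one way to fence off an
// init-once method like ConfigureAndWatch.
static class LogConfig
{
    static bool configured;
    static readonly object gate = new object();

    public static void ConfigureOnce(string configPath)
    {
        lock (gate)
        {
            if (configured)
            {
                // Telling callers they're doing something wrong is exactly
                // the feedback the original API never gave.
                Console.WriteLine("Warning: logging already configured; ignoring " + configPath);
                return;
            }
            // ... call the real ConfigureAndWatch(configPath) here ...
            configured = true;
            Console.WriteLine("Configured from " + configPath);
        }
    }
}

class Demo
{
    static void Main()
    {
        LogConfig.ConfigureOnce("web.config");
        LogConfig.ConfigureOnce("web.config"); // simulated second page view
    }
}
```

The second call costs a lock and a log line instead of re-parsing configuration on every request, and the warning surfaces the misuse instead of hiding it.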
Scott goes on to give an example whereby he actually made every class "internal" in his API, and waited for users to tell him what classes they wanted to extend, and extended them one by one. [Paul]
This little bit of inspired brilliance was not my idea, but rather Brian Windheim's, an architect at Corillian. We had an application that consisted largely of base classes, and developers were insisting that they needed infinite flexibility. We heard "infinite" from the developers, but not the business owner. Brian theorized that they didn't need as much extensibility as they thought, and shipped an internal, basically sealed version. When folks needed something marked virtual, they put it in a queue. The next internal version shipped with something like 7 methods in one class marked virtual - meeting the needs of all - when originally the developers thought they wanted over 50 points of extensibility.
The point of Brian's exercise was to find a balance between extensibility, both explicit and implicit, and supportability.
When you mark something virtual or make a class public, you as a framework designer are explicitly expressing support for the use of that API. If you choose to mark everything virtual and everything public as Paul advocates, be aware of not only the message you send to the downstream developer, but also the unspeakably large combinatorics involved when that developer starts using the API in unexpected ways.
Cyclomatic complexity can give you a number that expresses the complexity of a method and offer valuable warnings when something is more complex than the human mind can comfortably hold. There are other tools (like NDepend and its Afferent Coupling metric, Lattix and its Dependency Structure Matrices, and Libcheck and its measure of the churn of the public surface area of a framework) that can help you express the ramifications of your design decisions in fairly hard metrics and good reporting.
If you mark all your classes and methods public, be informed of these metrics (and others) and the computer science behind them and acknowledge that you're saying they aren't right for you. Just be aware and educated of the potential consequences, be those consequences bad or good.
Can you honestly rely on people who are "just playing" with a technology to tell you which bits they will need to be extensible 12 months into the future?
You totally can't. When you're designing for Users, you do a usability study. When you're designing for Developers, you need to do a developability study.
Microsoft actually does more of this than most folks think. Sure, there are the Alphas, Betas and CTPs, but there are also TAPs (Technology Adoption Programs) and Deep Dives, where folks go to labs at Microsoft and work on new technology and frameworks for a week while folks take notes. These programs aren't for RDs or MVPs; they're for developer houses. If you're interested, ask your local Microsoft rep (whoever organizes your local Nerd Dinners perhaps) how you can get into an Early Adopter Program for whatever technology you're hoping to influence. They really DO listen. We just came back from a Deep Dive into PowerShell and got not only access to the team but a chance to tell them how we use the product and the direction we'd like to see it go.
Scotts [sic] philosophy, and that of many people at Microsoft (and many component vendors - Infragistics being another great example), seems to be to mark everything as internal unless someone gives them a reason to make it public.
That's not my philosophy, and I didn't say it was in the presentation. It was part of the schtick. The slides looked like this with Keith as Ivory Tower Guy first, then me as Realist Guy, and the "in actuality" slide last with guidance we could all agree on. However, I still think that marking stuff internal while you're in your design phase is a great gimmick to better understand your user and help balance good design with the important business issue of a supportable code base.
The salient point in the whole talk is be aware of the consequences of extremes and make the decision that's right for you and your company. (Very ComputerZen, eh?)
Paul's right that it is frustrating to see internal classes that do just what you want, but simply marking them public en masse isn't the answer, nor is marking everything internal.
Friday, 20 October 2006
Jason Scheuerman from my company has created a PowerShell Cmdlet Visual Studio 2005 Template so you can create PowerShell Cmdlets using File|Add New Item.
In the screenshot at right, I've selected File|Add New Item and entered get-thing.cs as the name of my new Cmdlet.
If you want to use this Item Template, drop this zip file into your C:\Documents and Settings\<YourUserNameHere>\My Documents\Visual Studio 2005\Templates\ItemTemplates.
Don't unzip it, just put the ZIP itself in that folder.
You can learn more about creating Cmdlets (they're different from PowerShell Scripts (PS1 files) in that they can integrate more tightly with the pipeline and they can use parameter binding) at MSDN.
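For reference, the class such a template would generate looks roughly like this. This is a minimal sketch against the PowerShell v1 SDK; the Get-Thing verb/noun pair and the output string are my own example, not necessarily what Jason's template emits:

```csharp
using System.Management.Automation; // from the PowerShell SDK

// A cmdlet is just a class deriving from Cmdlet with a [Cmdlet] attribute
// naming its verb and noun (invoked as Get-Thing once registered).
[Cmdlet(VerbsCommon.Get, "Thing")]
public class GetThingCommand : Cmdlet
{
    private string name;

    // [Parameter] opts this property into PowerShell's parameter binding,
    // one of the big advantages cmdlets have over PS1 scripts.
    [Parameter(Position = 0)]
    public string Name
    {
        get { return name; }
        set { name = value; }
    }

    protected override void ProcessRecord()
    {
        // WriteObject emits objects to the pipeline rather than text to the
        // console, so the output composes with downstream cmdlets.
        WriteObject("Thing: " + Name);
    }
}
```

Compile this against System.Management.Automation.dll and register the assembly as a snap-in, and `get-thing -Name foo` participates in the pipeline like any built-in cmdlet.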
There's more about the difference between Cmdlets and scripts in my interview with Jeffrey Snover at Hanselminutes.com.
Friday, 20 October 2006 20:33:50 (Pacific Daylight Time, UTC-07:00) by Scott | Trackback
|© Copyright 2006 Scott Hanselman newtelligence dasBlog 1.9.6288.0