October 2010 Blog Posts
User Stories Can Be Wrong

Scott Bellware has a good post about how user stories/requirements need to be tested.  Snippet:

“Generally, we're taking too long to get stories validated - even in colloquial Agile development. Sending a user story through an entire development process in order to validate it is too much.”

Users can tell you what they want, but they can be wrong about what they tell you, for a multitude of reasons.  This is one of the reasons why people do usability studies, as there is often a significant divergence between how users say they are using software and how they actually use it.  Sometimes they literally forget about things they do to get around errors, e.g. “yeah, sometimes we get that error message but if we hit F5, it goes away".

Spikes, prototypes and simulations (of the iRise or balsamiq or SketchFlow variety) can be very helpful here (though you have to fight the tendency for prototypes to be forced into production as ‘good enough’).

Some of his analysis seems to fall a little too much into the “there are two types of people, those that say that there are two types of people and those that don’t” category, but overall, it’s a good read.  Check it out.

posted @ Thursday, October 28, 2010 3:29 PM | Feedback (0)
Where and What is WebResource.axd?

I’ve always wondered about this.  Scott Mitchell gives the answer here.

Snippet:

“WebResource.axd is an HTTP Handler that is part of the .NET Framework that does one thing and one thing only – it is tasked with getting an embedded resource out of a DLL and returning its content.”

posted @ Thursday, October 28, 2010 2:17 PM | Feedback (0)
Funny comment spam

Got this comment recently:

“Thanks for that link it was a huge help as I'm a DNN newbie stuck on a vista machine. 2 things the guide misses out though, you need to add default.aspx to the default docs list in IIS and you need to select the http option in .net Framework 3.0 (Turn Windows Features on or off), otherwise, spot on! ”

What’s funny about it is that it was a comment to my link to a Jefferson Starship video.  And, of course, the commenter’s URL was to some casino site.

posted @ Tuesday, October 26, 2010 10:40 AM | Feedback (0)
ESPN’s Secret Weapon: Analysis you can’t get anywhere else

As the Countdown team prepared for Sunday’s games, someone (Mike Ditka, I think) was asked what some team (Arizona, maybe) needed to do to win a game with a rookie QB (Max Hall, most likely).  Hell, it doesn’t matter the context.  Here was the analysis (near verbatim):

  1. “They need to establish the running game.”
  2. “They need to protect their quarterback.”
  3. “Someone is going to need to make a play on special teams.”

Well, thank GOD, they cleared that up.

ESPN isn’t the only one that does this, but they like to hire ex-jocks because, somehow, they think it allows people like Trent Dilfer to say something like “Well, I played in the league for 14 years and have been watching it for a long time and….” before saying something that makes him look like an ass-hat.

Back when I taught Philosophy, one of the first topics to cover was the most typically committed logical fallacies and how to avoid them.  What Dilfer was committing was the “I’m about to sound like an ass-hat, but I played in the league and you didn’t, and even won a Super Bowl, though only because of Ray Lewis, but you didn’t, so ‘neener, neener, neener’” fallacy.

posted @ Monday, October 25, 2010 7:32 PM | Feedback (0)
Microsoft relinquishes control of IronPython and IronRuby

As described here.

Clock started on how long until someone ‘explains’ why this makes Microsoft eeeeevvvviiiilllllll!!!!!!!

posted @ Friday, October 22, 2010 9:08 AM | Feedback (0)
Review–Danger! Software Craftsmen at Work

Reading through a generally execrable post by Gael Fraiteur of PostSharp fame, I came across a video by David Harvey that took a similar skeptical stance about Software Craftsmanship.

<digression>The reason Gael’s post is so execrable is that it perpetuates the total nonsense that software developers can be split into two groups, those that ‘care’ and have ‘passion’  and those supposedly godawful 9-5 developers that only care about covering their asses.  This is a meme that really needs to be combated and destroyed at every turn.  I know a number of people who are on the Software Craftsmanship bandwagon, and they are in many ways great folks, but I’ve worked with some of them.  They aren’t really all that intellectually superior or more passionate than the people they think they contrast themselves with.  Not even close.  With my massive ego that can be seen from space, I can ‘out-superior’ any of them pretty easily and most of what they think is “Craftsmanship” is garbage.  But I digress.</digression>

I don’t think Harvey’s presentation is entirely successful, but he makes a couple of points that are worth discussing. 

Paraphrasing a bit, Harvey points out the obvious “Being against craftsmanship is like being against World Peace or kittens.”  Moreover, “there is nothing new about craftsmanship, it’s been around since there has been software development.”  The latter is a healthy point to make, as it pushes back against the current Manifesto driven people who think they are pushing something new.

But though I think Harvey was being somewhat facetious when talking about “World Peace”, it is an analogy that is important.

It’s all well and good to “pull a John Lennon” and sing songs about giving peace a chance and what not.  It’s a nice sentiment and all.  But it sort of runs against the reality of the world in thinking that sentiment is going to change all that much, or that you should think having a nice feeling but completely naïve sentiment is in and of itself a good or useful thing.

It’s basically idiotic to think that, for instance, the Middle East conflict or the conflict between India and Pakistan are conflicts that exist simply because all of the various participants don’t understand that you should give peace a chance.  The world is just a little bit more complicated than that.  While idealism has always been a force that can push people and cultures in directions that wouldn’t be possible otherwise, idealism is almost always, in and of itself, a failure.  Actual reality, actual context, trumps naïve idealism.

To push the analogy, it’s all very well and nice to push an idealistic view of current software development practices as being of craftsmanship against ass-covering, but it’s an unrealistic picture of actual reality.  The idea that (generally misguided) views of software development deficiencies will be solved if only we can get enough Software Craftsmen involved is not only idiotic, but a waste of time.  Actual reality, actual context, trumps naïve idealism.

But continuing to push the same analogy, just because you don’t think that chanting “Give Peace a Chance” at a bed-in is all that useful in solving world conflicts, doesn’t mean you don’t believe in finding peaceful solutions to global conflicts.  Similarly, just because you don’t think that introducing SRP into every possible software development practice will make software development practice perfect, doesn’t mean you don’t believe in making software development better.

It is possible to practice good software development using DataSets, or not doing TDD, or <insert blah blah here>.  Against the backdrop of Idealism, it will always seem bad, or just “that sort of thing that 9-5 people do”, but that isn’t the case.

I’m not sure which is more likely to be solved earlier, the conflict in the Middle East or the ‘conflict’ in software development.  The former actually matters, the latter is generally not that important.

But I think it is important within software development to make clear that the ‘monoculture’ of Software Craftsmanship™ is dangerous.

posted @ Thursday, October 21, 2010 10:31 PM | Feedback (0)
Does this 16 GB of Ram make my butt look big?

[image: screenshot of system memory usage]

One of the advantages of moving completely 64-Bit is that you can finally use all of the memory you can install on a system (32-bit Windows always hit the limit around 3.5 GB, depending on the machine, due to memory addressing blah blah blah…..I remember when I used to have to set up memory management so that I could play whatever version of Ultima or whatever I wanted to play at the time which required that extra 12 MB you had to squeeze in somewhere…I vaguely remember running some version of Novell DOS at one point just because I could find the extra memory…..old old old).
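For the curious, the ~3.5 GB ceiling is just address-space arithmetic; here's a back-of-the-envelope sketch (the reserved amount is a typical figure I'm assuming, not a fixed constant):

```python
# Back-of-the-envelope for the 32-bit memory ceiling. A 32-bit pointer
# can address 2**32 bytes; part of that range is mapped to devices
# (video memory, PCI, BIOS) rather than RAM. The 0.75 GiB reservation
# below is a hypothetical mid-range figure, not a fixed constant.

address_space_gib = 2**32 / 2**30   # total addressable: 4.0 GiB
reserved_gib = 0.75                 # assumed device/MMIO reservation
usable_gib = address_space_gib - reserved_gib

print(f"usable RAM: about {usable_gib:.2f} GiB")  # usable RAM: about 3.25 GiB
```

Depending on the machine's chipset and video card, the reservation varies, which is why different 32-bit boxes top out at slightly different amounts.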

It was less than $500 to load up my new Dell 8100 XPS Whatever it is I have, so I figured, what the hell.

In case anyone is curious, the CPU usage is due to Firefox.  I have somewhat…unique usage patterns with Firefox.  I currently have 17 windows with an average of 10 tabs each open (I’m WAAAAYYYYY behind on my reading) which takes 1.5 GB by itself.  Largely due to Flash, I think.

posted @ Thursday, October 21, 2010 7:41 PM | Feedback (1)
Diverse.NET–This is good why exactly?

So, as I mentioned, Microsoft released NuPack, which led Rob to write a post, which led Ian to write a post, which various people have commented on one way or another.

This is another one.  Read stuff in order to make some/most sense of this.

I’m actually surprised it took as long as it did for one of the usual suspects to take the stand that Ian took.

Grossly paraphrasing, some of the themes include:

  • By introducing NuPack, Microsoft has unnecessarily harmed the OSS ecosystem around .NET package managers, including Horn, Nu, and OpenWrap.
  • In general, Microsoft does not do enough to promote “from the community” solutions to things, preferring to NIH their own instead.
  • When Microsoft does this, it hinders the growth of the community.
  • In general, the .NET community doesn’t really like to “engage with a diverse ecosystem over a monoculture.”
  • This is a bad thing, and (probably) only Microsoft has the ability to promote a diverse ecosystem.

There’s more (and less), of course, but that’s enough for now.

I’m going to go ahead and ask the obvious question (which Ian tries to answer, but not satisfactorily in my mind).  Why is having a diverse ecosystem so important?  Why is it a good thing in itself?  Given the sort of bland (and largely mindless) promotion of diversity (in the school, in the workplace, etc.), it might seem odd, even controversial, to question the value of diversity.  But, let’s leave the large philosophical and political questions aside.

Ian’s answer is this:

There does seem a lack of willingness within the .NET community to engage with a diverse ecosystem over a monoculture. In the long term this is not good for us. It does not drive the kind of innovation a successful development platform needs.

The first point (“lack of willingness”) is largely, I guess, true.  The second point (“this is not good”) is highly debatable.  The third point (“does not drive…innovation”) is demonstrably false.  The implied fourth point (that .NET is not a successful development platform) is utter nonsense.

A huge part of all of this has to do with OSS itself.  Just as diversity is supposedly a good thing in and of itself, so is OSS (which is what drives people like Rob to suggest adopting an Open Source project, which is a fine suggestion for people who are interested in that sort of thing).  The problem is that there really isn’t any reason in the abstract to believe it, not necessarily anyway.  Simply looking at the practical results, having open source alternatives has sometimes been a major positive, sometimes not so much (contrast the existence of Linux and what it has done to push not only Microsoft but other companies as well with the existence of Star/Open Office and what it has meant to spreadsheets, namely nothing).

.NET is absolutely a successful development platform.  There is simply no believable response otherwise, and since Ian has never, in my readings of his blog posts, been one to utter unbelievable responses, I have to believe he meant to say something slightly different.  Maybe that there are aspects of Ruby (everybody’s current love) and Java and their communities that he thinks would make .NET more successful.  Even that is questionable.

But the larger question is why having a diverse ecosystem is so important that Microsoft itself should (in some vague sense of ‘should’) promote it.  I think how one feels about this depends largely on how one defines “the community” and, ultimately, how one feels about OSS in general, in a couple of different ways.

If you come from a perspective that OSS efforts that are not led/driven by Microsoft have an intrinsic value, then, obviously, this diversity is important.  This is why I don’t think there has been much complaint about Microsoft’s involvement with jQuery, because Microsoft is just one contributor (albeit a large one).  With NuPack, at least initially, I think the suspicion is that Microsoft, to all intents and purposes, is and will be the only main contributor, and this suspicion is, I think, largely based on Microsoft’s historical attitude towards OSS.  Surely (don’t call me “Shirley”), they simply want to crowd out other alternatives.  Why can’t they, for instance (as one commenter on some blog somewhere suggested), promote NHibernate instead of funneling all those resources into EF, or why couldn’t they have promoted Monorail instead of creating ASP.NET MVC, etc. etc. etc.

One can see a wish/hope in this perspective that Microsoft “pull an IBM.”  Though it is hardly the only reason, or perhaps not even the main reason, why Linux took off in the corporate environment, IBM’s embrace of Linux certainly didn’t hurt, to put it mildly. 

And if one thinks of “the community” from this perspective, then NuPack can look like a step back.

But that isn’t really the .NET community.  The actual .NET community includes the vast majority of people who are fine with the monoculture, for one reason or another.  The innovation that is occurring throughout .NET in non-Microsoft-led projects is either irrelevant or off-limits for a variety of reasons.  Having worked at two of the “too big to fail” financial institutions named in a patent lawsuit recently, I understand that in at least one of them, the legal fallout was that the use of OSS has to be justified and officially approved.  I strongly dislike not being able to use NHibernate, as it isn’t approved.  Everybody knows that Enterprise Library sucks as an alternative (and while EFv4 actually isn’t all that bad, .NET 4.0 is still in various stages of corporate adoption).  But, guess what?  Microsoft can provide patent indemnification if you use their tools.  And, guess what?  Large corporations, especially of the “too big to fail” financial variety, are the defendants in these lawsuits.  From a business perspective, they (rightfully) don’t give a rat’s ass as to the technical merits of NHibernate vs Enterprise Library vs. EF or of Monorail vs. ASP.NET MVC or even of Nu vs. NuPack.

Summary

So, to reformulate my original question, should we promote a diverse ecosystem within .NET?  It depends.  There are those who have accepted the idea that OSS is intrinsically valuable, and so will always find Microsoft’s efforts lacking.  There are those who have accepted the idea that OSS is intrinsically bad.  One can make a reasonable argument that this was Microsoft’s position (or at least Ballmer’s position) up until a few years ago.  I think that people like Scott Guthrie, Phil Haack, and Scott Hanselman live in a space where they can promote OSS within Microsoft, but that space has its limits.

I think that IBM was more easily able to embrace Linux because they are, in large part (though not only), a hardware company.  Microsoft (besides their keyboard and mouse stuff) is a software company.  Unless and until they decide to open source Windows and/or Office itself (which I think we can agree will be never), there will always be limitations to their embrace of OSS.  This means that the dream/ideal that Microsoft will take an active role in promoting alternative ‘community’ solutions will probably always fall short in the eyes of those who view OSS as intrinsically valuable.  I don’t see how that will change.

From an individual perspective, I think that if you feel OSS is intrinsically valuable, taking Rob’s advice concerning adopting a particular OSS project is probably something you should consider.   There is always the risk that Microsoft will enter the space of that project, but there are risks in everything.

Diverse.NET?  Sure.  Within limits.

posted @ Monday, October 18, 2010 6:12 PM | Feedback (0)
(Vaguely) Technology Related Signs One is Getting Older

I’m sure there are many others, but these come to mind:

You remember when MTV used to show music videos.

You misspell something in a text message, and immediately send a correction, as if the person receiving the text message is so much of an idiot that they can’t possibly figure out what you meant.

You find yourself thankful that when you were growing up, there was no way that you could have texted a sexually explicit picture of yourself to someone you really thought you loved at the time.

posted @ Sunday, October 17, 2010 9:46 PM | Feedback (0)
NoSQL Growing Pains

FourSquare suffered a near half-day complete outage of their site due to problems with their MongoDB implementation.  You can read the details here.

I won’t comment much on the actual outage, since the details are readily available.  It looks like a ‘standard’ outage that one might be familiar with when dealing with other backend systems.

It does point out that, while NoSQL implementations are heralded as able to scale to ‘internet size’ (whatever that means) in a way that RDBMS-based sites supposedly can’t match (a myth, but that’s okay), running any large site will often lead to operational difficulties.

Does this invalidate NoSQL in any way?  No, not really; instead, it’s just a reminder that real-world events can ‘override’ theoretical notions in a hurry (e.g., the notion that NoSQL doesn’t have to face the same sort of difficulties that RDBMS-based systems have dealt with for years).

posted @ Friday, October 15, 2010 9:13 PM | Feedback (0)
A potential downside of Continuous Improvement

Something that the alt.net kool-aid drinkers and Software Kraftsmen people talk about is the notion of ‘Continuous Improvement.’  I will go ahead and shock some people by stating for the record that I am generally in favor of continuous improvement (I know, who knew?).

An obvious potential downside of Continuous Improvement is that people sometimes confuse ‘Change’ with Continuous Improvement.  Doing something different because you’ve read a blog post about why doing Y instead of X is better doesn’t actually mean Y is better than X.

But, let’s leave that aside.  Let’s assume that you have decided as a developer to do Y instead of X, and Y really is better than X. 

When you are in a mindset of continually improving, it can be very difficult to resist the urge to refactor a code base that does X to do Y.  You look at how you did X, and you just know it would be better to change it to do Y.  Since refactoring is a key part of being a decent developer, learning when you need to resist refactoring is important.

I wrote a code base a few months ago, doing X.  In the interim, I’ve come to believe that doing Y is better.  In the abstract ideal, refactoring the code base to do Y is the right thing to do.  But doing X worked.  Refactoring it to Y won’t make the code base meet its functional requirements better (since they are already met).  It might, in certain scenarios, make potential future work easier to do.  The time it takes to refactor a currently working system has to be balanced with an estimate of what improvements in future potential work will actually occur, and, in my experience, this is hard to do.   Why?

Within Kanban, and other concepts, there’s a notion of ‘local optimization’, where (paraphrasing crudely), it is easy to make the mistake of thinking that improving the development process is an overall improvement.  As developers, it’s hard to think otherwise.  Whenever you discover a development improvement, it can be very invigorating.

But consider…suppose it takes a week to develop a code module, and then takes a week to validate that code module through your QA process.  If you find a development improvement that allows you to develop a code module in a day instead of a week, what can happen is that you’ve now increased the flow of development work into QA’s queue.  Suppose, as often happens, that someone is tracking the time it takes for completed development work to be validated by QA.  You’ve now screwed up that tracking, and so a management report that used to show a smooth transition between development and QA now suggests that QA isn’t doing their job (or whatever).  By making things better, you’ve made things worse.
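The arithmetic in that example can be sketched as a toy throughput model (the function and numbers are mine, not from any Kanban tool, and the model ignores everything except raw rates):

```python
# Toy model of the local-optimization trap: speeding up development
# without speeding up QA just grows the QA queue. Assumes steady,
# uninterrupted work at fixed per-module rates (a deliberate
# oversimplification for illustration).

def qa_backlog(dev_days_per_module, qa_days_per_module, horizon_days):
    """Return the QA queue length after `horizon_days` of steady work."""
    produced = horizon_days // dev_days_per_module   # modules leaving dev
    validated = horizon_days // qa_days_per_module   # modules QA can finish
    return max(0, produced - validated)

before = qa_backlog(5, 5, 50)   # balanced flow: dev and QA keep pace
after = qa_backlog(1, 5, 50)    # "improved" dev: modules pile up in QA
print(before, after)  # 0 40
```

Same QA team, same QA process, but the faster dev pace turns a healthy-looking flow into a 40-module backlog over ten working weeks, which is exactly the kind of number a management report will notice.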

Developers like to reduce friction (well, some do).  It’s hard to suggest that they shouldn’t.  But, you have to keep in mind the overall software development flow.  Ideally, since your development of code modules now takes a day instead of a week, those developers should work on helping the QA team improve their processes so that validation can be reduced to a day as well.

Work on continuous improvement.  Learn to differentiate between doing something different and improving.  Figure out how it impacts all parts of the overall software development flow, and adjust accordingly.

posted @ Wednesday, October 13, 2010 10:43 PM | Feedback (0)
A danger with practicing TDD

Jimmy Bogard has a post up about how to write effective UI tests.  You should read it, as it’s a really good post.

Though it isn’t really part of his point, I want to ‘hijack’ something that he says for my own purposes:

“For about 3 years, I wrote absolutely horrible UI tests.  These tests worked great for a while, but like most people that tried their hands at UI tests, found them to be utterly brittle and unmaintainable in the face of change.

And, like most people, I would mark these tests explicit or ignored, until I ditched most or all of them.  I’ve heard this story over and over again, from folks that wrote lots and lots of UI tests, only to see their investment go to waste as the time and effort to maintain these often-breaking tests far outweighs any benefit gained from full, end-to-end UI tests.”

Jimmy is talking about UI tests, obviously, but I think what he says applies to TDD in general.

As any reader of my blog knows, I think TDD sucks.  To be clear, I think that developing without TDD sucks also, so there’s a cost/benefit analysis that needs to be done in any event, but one of the things that has been common in almost all ‘real-world’ experience reports about using TDD is that it sucks, for a (usually) long period of time, and then the practitioners figure out ways to make it suck less.

I think that there is something specific about TDD which causes this, and is yet another reason to avoid it.  And I think it also applies to UI testing that is improperly focused.

Suppose you are building a framework, especially a framework where you don’t have any realistic idea of how that framework is going to be used.  In this case, use TDD.  Test happy paths, test odd paths, test bizarre paths.

For everyone else, where you are building an application, don’t use TDD.  Test at the level of the application, which means using Context/Specification style testing.  What does your app actually do?  Prevent odd and bizarre and stupid paths at the application level, and then test what happens on the happy path.
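To make the contrast concrete, here's a minimal sketch of Context/Specification style in plain Python (the `Cart` class and all names are invented for illustration; a real spec runner would discover the methods for you rather than requiring manual calls):

```python
# Context/Specification style: establish one context (the "given"),
# then make one observation per spec method. The Cart class and its
# behavior are hypothetical, invented purely for illustration.

class Cart:
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    @property
    def total(self):
        return sum(price for _, price in self.items)


class WhenACustomerAddsTwoItemsToTheirCart:
    def establish_context(self):
        self.cart = Cart()
        self.cart.add("book", 10)
        self.cart.add("pen", 2)

    def it_should_contain_both_items(self):
        assert len(self.cart.items) == 2

    def it_should_total_their_prices(self):
        assert self.cart.total == 12


# Run the specs by hand (a real runner would discover these):
spec = WhenACustomerAddsTwoItemsToTheirCart()
spec.establish_context()
spec.it_should_contain_both_items()
spec.it_should_total_their_prices()
print("specs passed")
```

The point of the style is that each spec reads as a statement about what the application does in a named context, rather than as a test of an individual class's methods.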

Do not try to force TDD on teams that aren’t experienced with it, because what will happen is that the team will produce horrible and brittle tests that are unmaintainable.  If you’ve ever been in a situation where this sort of thing happens, it’s totally discouraging.  It spreads the idea that there’s something wrong with testing.

Don’t use TDD.  Move beyond it.  Please.

posted @ Wednesday, October 06, 2010 11:07 PM | Feedback (1)
NuPack released

There's all sorts of stuff out today about it, but the 'official' announcement can be found here.

Special award goes out to Jeff Olson for the first snarky comment about it (in the comments to the announcement), and his use of 'auto-fellate'.  Snippet:

"Released "only" as a VSX package? Really? Only supporting 2010, no less? I guess that's great for MVP bloggers, community evangelists and other assorted MS Employees who like to auto-fellate on all the Great Things they do for the community, but there's still a *huge* number of enterprise devs out there in the wild still on VS2008 and 2.0/3.5 ....Bonus points for talking about the problem-space as though NuPack were the only possible solution, to date. Kudos."

Awesome.

posted @ Wednesday, October 06, 2010 12:18 PM | Feedback (1)
How to manage user stories?

Scott Bellware has a post about user stories in which he discusses a very obvious, but not obviously manageable, issue. 

<digression>Since Scott was ‘kicked to the curb,’ so to speak, from the MVP C# program, it’s easy to dismiss a lot of what he says as sour grapes.  Given our past lack of sympatico, it would be easy for me to dismiss a lot of what he says as someone grasping onto Ruby as a reaction to particular events.  But, I think that would be a mistake.</digression>

When using a tool/language/whatever approach like I use with SpecFlow, especially in situations where the developer is also the business analyst, it is very easy to get tied to the particular code implementation that something like SpecFlow encourages. 

User stories, as well as the business requirements that drive them, are ‘organic’ in nature, in the sense that they are driven by the immediate understanding of the people involved in determining them.  In other words (more or less), they can and should change on a real-time basis.  It’s well and good to create BRD documents as a starting point, but what tends to happen is that those requirements become some unalterable things that don’t match up with the fact that as you develop software, they need to change.

Glossing over the details wildly, when using SpecFlow, the user stories are embedded in code.  But since you want everyone to be able to adjust user stories, there’s a gap: when a business user wants to change a user story based on new requirements, or whatever, how do you match up the actual intent of the user story with what needs to be in code?

I don’t know what the answer here is.  With the projects that I have control over, it’s (more or less) easy to make my code-embedded specifications match my user stories.  But how to handle them otherwise?
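To make the gap concrete: with SpecFlow, a user story lives in a Gherkin feature file like the hypothetical one below, and each Given/When/Then line is bound to a C# step method. A business user can edit the text freely, but every edit risks orphaning the step bindings behind it:

```gherkin
Feature: Account withdrawal
  As an account holder
  I want to withdraw cash
  So that I can get money when the bank is closed

  Scenario: Withdrawal within balance
    Given my account balance is 100
    When I withdraw 40
    Then my remaining balance should be 60
```

The feature text looks like plain English, but it is effectively source code: rename "withdraw" to "take out" and the binding no longer matches, which is exactly the intent-versus-code gap the post describes.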

posted @ Tuesday, October 05, 2010 11:35 PM | Feedback (0)
A Week in Manhattan

My dad worked in Manhattan for something like a dozen years, and lived in a building that had free guest rooms (awesome!) so I’ve been to NYC dozens of times, but since he retired and the free room and board went bye-bye, I hadn’t been back in 7 years or so.  Things are going well, and I had tickets to the Porcupine Tree concert at Radio City Music Hall (see post here), so I decided, what the hell, let’s make it a really expensive week long vacation.  And it’s easier to post than send emails sometimes.  As a solely personal post, those looking for technical content can safely skip it.

Driving into Manhattan

I’m not quite John Madden-like in my dislike of air travel, but I like long drives.  Perhaps it is because as a kid, I lived in Canada while all of our relatives were in Michigan or Missouri, and so I got used to it.  It helps me to clear my mind and think deep thoughts (that’s a joke).  In any event, I like them and so drove out from Chicago to NYC, stopping off in Pittsburgh to catch a Pirates game for the hell of it (they won, which doesn’t happen often, and the game featured a post-game fireworks show, which included at one point fireworks being shot off to the dulcet tones of the Cookie Monster singing “C is for cookie”).  I guess you had to be there.

<digression> Satellite radio is fantastic for long drives, especially on weekends during any sports season.  The Sunday I drove into Manhattan, I had the joy of listening to the Steelers beat the Titans (which is hard not to do when the opposing team has seven turnovers, though the Steelers tried, by going into a prevent defense and giving up a late touchdown and an onside kick…).  I’m going to go out on a limb here and say that the Steelers’ radio broadcast crew has to be at the top of the list for “Worst Broadcast Team, any sport, any medium.”  I don’t even necessarily want to know who the announcers are; let them live on in their anonymous horrific-ness.  The lead announcer had a remarkable inability to follow what was happening on the field (“I think the QB just fumbled the snap, I’m not sure, it sort of looked like it, and an incomplete pass….not a sack!”), properly identify the players making plays (“And that was a heck of a tackle by James Harrison, he nearly took his head off!!!  James Harrison is proving once again that he is not only one of the greatest defensive players of our generation, but possibly up there with the all-time greats….<pause>….no, that was Woodley, Woodley with the hit”), or otherwise allow you to easily visualize the individual plays, which is normally considered an important part of the job of a lead announcer.  The color commentator was sort of typical, with the random hyper yelling at odd times, and for no apparent reason.  The best part of the broadcast was the (I assume) ex-jock sideline reporter, who was generally unable to provide any useful information whatsoever.  
After one fumble, they tossed it down to him…“So, were you able to see who recovered the fumble?”…“No, I’m blocked here by the people on the sideline, so I couldn’t see anything, but once the offensive line started hooting and hollering and putting on their helmets, I figured something good had happened.”  His best line, when commenting on the heat during a lull, was something like “It’s so hot here, it’s like you could smoke a turkey in the Sahara!”  Um, well, typically when smoking meat, you want to do a slow, low…wait a minute, what? 

It was three plus hours of pure broadcasting fantastical-ness.  At least I got to listen to the broadcast of a game played by Charlie Batch, the greatest QB of our era.  But I digress</digression>

Driving into the city, I discovered that you could actually send and receive text messages while in the Jersey Tunnel, which was a surprise (don’t text and drive kids), but the real fun came once I got into the city.

Like I said, I’ve been to the city many times, and driven into it often as well.  But, since I’ve gotten a GPS-enabled vehicle, I’m pretty much totally dependent on it, and rather lazy.  I didn’t bother to print out directions or anything, I just plugged in the address of the hotel and let er rip.

One problem is that some of the streets in Midtown were blocked off to traffic.  They opened them up later in the week, so I don’t know if it was a special event, something they do every weekend, or what, but some of them happened to be streets that GPS really wanted me to use. 

Then, and I’m not sure how this is even possible, but it did happen, the GPS had the directions of cross-streets reversed, and so kept asking me to drive the wrong way down one-way streets, which isn’t a good idea in any city, much less Manhattan.

At this point, I was kind of screwed, since all I had bothered to look up was that the hotel was on Broadway, somewhere between 45th and 48th.  Finally, I just found a city garage and parked there, and moved the car to the hotel valet parking the next day.

Note to self: have a backup plan for when GPS screws up.

Staying in Manhattan

A pretty common refrain I've heard from other people who've visited the city is that the hotel rooms tend to be glorified walk-in closets, which isn't surprising.  The smallest hotel room I've ever stayed in had a king sized bed and maybe 3-5 feet of clearance around it.  And a bathroom.  So, I was expecting something like that, but hoping for better.

Much better.  The Marriott Marquis ain't cheap, but the room was worth it.  King sized bed and enough room for a lounge chair and a sofa.  And overlooking Times Square.  Expensive, and valet parking was more expensive, and ESPN HD occasionally locked up on the TV, but what the heck.  It was worth it.

Walking in Manhattan

Some people might be appalled at this, but you could argue that I spent a week in one of the greatest cities in the world, doing essentially nothing.  Having been to the city so many times, I’ve been to most of the museums, at least all of the ones I care about (I have to admit, and this will shock some people, I’m not necessarily that much of a museum guy).  And I don’t like theater shows that much (I already have culture coming out the ying-yang, obviously), and there was nothing playing that would make me really want to see anything (they made a musical based on the music of Green Day?  Seriously?  Wow, sign me up….).  So, I did what I’ve often done, and that is walk, walk around as much of the city as possible, stopping in shops and just enjoying the sights.

For instance, the first full day I was there, I took the subway down to Battery Park and from there walked up through the business district to Wall Street, then headed east over to the South Street Seaport and Pier 47.  From there, it was back across the city to the west, through Trinity Church and over to the World Trade Center site, and a view of the new Freedom Tower, then over to the World Financial Center.  After that, it was back across town and then up through Chinatown and Little Italy (which was at the beginning of the Feast of San Gennaro), and then a walk all the way up Broadway back to the hotel, in time to relax a little bit before dinner (more on that below).

On other days, I crisscrossed the city, starting at 32nd St and 8th Ave and heading east to Park Place, then up a block to 33rd and back west.  Depending on which street it was, I might go farther east or west.  I wanted to go to the UN, but the General Assembly was in session.  I did get to see one of those all-black vehicle motorcades as they jetted some diplomat across town while automatic-weapon-toting security stood watch, which you don't see every day.  Anyway, I did this crisscrossing all the way up to Central Park South, and on another day, took a two-hour walk through most of Central Park itself.  I also walked through Greenwich Village, Soho, and Tribeca.  On another day, to get off my feet for a while, I took the 3-hour Circle Line boat tour around the island, which I hadn't done in quite a while.

When doing a walking tour like that, it's really important to have comfortable shoes.  Being in shape helps (note to self: get in better shape).  And, let's just mention one word: chafing.  Let's move on.

Eating in Manhattan

The other main thing I did was eat.  I've never been to culinary school (though I know someone who graduated from the CIA), but I did work as a cook through much of college.  These days, I tend to watch two things on TV: sports and the Food Network.  Because of this, there were various restaurants from chefs I'd heard about, and I used OpenTable to get reservations at a bunch of different places I had wanted to go to (since I didn't confirm until about a month before the trip, I had no shot at getting into anyone's flagship place, except one).  There was a risk of having full multi-course meals every day, but walking 10+ miles a day helped to increase the appetite and keep the calories off.  Most of them.  Here are the places I hit:

Lupa

Lupa is one of Mario Batali’s places, serving up rustic Italian food that was delicious, and, relatively speaking, cheap.

  • Prosciutto di Parma
  • Clams with Frugola and Basil
  • Ricotta Gnocchi with Sausage and Fennel
  • Braciola
  • Sweet Corn Gelato

The only purely 'casual dress' dining place I went to (though I'm pretty sure every place had someone in jeans, etc.), it was a great way to start the week, the night I drove into town.  The prosciutto smelled as good as it tasted, and the clams dish was something I could make easily enough at home, nice flavors.  The braciola is one of those things that in theory is easy enough to make: take pork shoulder, stuff it with whatever, wrap it up and then braise it in tomatoes and wine for a week and a half (or whatever) till it's practically melting.  And the sweet corn gelato was one of the most surprising dishes of the week, a definite winner.

Overall, very good, and easily the cheapest of the week, half the price of the most expensive meal.

Bar Americain

Bar Americain is one of Bobby Flay’s places, and was probably the least impressive of the trip (though that’s pretty relative).

  • Three Shellfish Cocktail Sampler: Lobster-Avocado, Crab-Coconut, Shrimp-Tomatillo
  • Gulf Shrimp and Grits, Bacon, Green Onions, Garlic
  • Fulton Fish Market Cioppino
  • Bourbon Praline Profiterole

If you watch any of Bobby Flay's 97 shows on TV, you know he's known for big bold flavors, which is why the meal was so surprising.  The Lobster-Avocado cocktail didn't have any avocado that I could tell, and the Crab-Coconut had no discernible coconut flavor (though the mango was good).  They weren't bad, of course, but didn't really jump out.  The Shrimp-Tomatillo was quite good, though.  Similarly, while the shrimp and grits was good, it was sort of bland, not very pronounced.

Apparently, the Cioppino was made in SF and then flown out or something.  I kid of course, and I'm not sure I necessarily would have noticed (I like long drawn-out meals, just drinking some wine and thinking my deep doctorate thoughts (joke again)), but both the waiter and the manager came to the table to apologize that they were having difficulty 'finding' my cioppino.  Couldn't find it?  Did it miss a connecting flight in Dallas?  Anyway, it wasn't that late, and it more than made up for the appetizer's blandness, a spicy winner worth the wait.

The ‘loser’ of all the dinners, it was still rather enjoyable.

BLT Prime

BLT Prime is one of the steakhouses from Laurent Tourondel, and this was the most expensive meal of the trip.

  • Seafood Platter (Oysters, Little Neck Clams, Crab Claw, Shrimp, 1/2 Lobster)
  • Wagyu Ribeye
  • Cheesecake with Blackberries and a Cornmeal Crust

Since I cook so much steak for myself (less than I used to, damn cholesterol), I tried to avoid having steak every night, but decided that this would be the night for it.  I went pretty classic here, a separated surf n' turf.  The seafood platter was good.  Well prepared and tasty.

Was the steak the greatest steak I'd ever had, the Wagyu head and shoulders above anything (save Kobe beef from Japan, perhaps) I've ever tasted?  Probably not.  Were the sauces I chose (horseradish and chimichurri) a revelation?  No, not in and of themselves.  But combined with a glass (or two) of a really fantastic Cabernet, this was one of those savor-each-bite, indulgent, 'f%&k it, I'm enjoying this' meals.  Small bite of steak, small sip of wine, enjoy.  Wait a minute.  Rinse and repeat.  I can't say how long I stretched it out, but at one point the manager came over and asked if everything was all right, or if I needed more sauces.  No, I'm fine, thanks.

A meal worth tipping well over.

Bar Boulud

Bar Boulud is one of the many places that serve classic and classy French food from Daniel Boulud, and it was probably the best overall meal I had.

  • Terrine of Slow Cooked Spiced Leg of Lamb, Eggplant and Sweet Potato
  • Spanish Iberico Bellota Ham
  • Corn and Mussel Chowder
  • Aïoli – Olive Oil Poached Cod, Louisiana Shrimp, Mussels, Garlic Dip, Quail Eggs & Vegetables
  • Black and White Boudin Tasting
  • Venezuelan Chocolate Grand Cru Tart, Cocoa Leaves, Ivory Crémeux, Raspberry-White Chocolate Ice Cream

Somehow, I got the cross-street on this one wrong, so I was momentarily confused.  Then, when I arrived, I sort of noticed but didn't pay attention to the fact that the hostess put down a second menu at my table, so the waiter didn't speak to me for 15 minutes until asking if I wanted something to drink while waiting.  And the initial cocktail was tiny.  Not exactly an auspicious start.  But from there, it was terrific.

The terrine is one of those things that any good French chef can probably put out in his sleep, but it's something you don't eat every day, and the taste was outstanding.  The Iberico was not quite as good as the prosciutto at Lupa, but that's quibbling.  The corn and mussel chowder was terrific, the Aioli so good I actually ate the vegetables, and the boudin just one of those things to enjoy.

Just a great French meal.

Morimoto

Luckily, the $30-an-ounce Kobe beef steaks were off the menu, making the choice of the chef's tasting menu at Morimoto an easy one.

  • Tuna Tartare
  • Fluke Carpaccio
  • Albacore Tuna Caesar Salad
  • Oyster with Foie Gras in Miso
  • Sushi course
  • Wagyu Filet and Garam Masala Lobster
  • Chestnut Cake with Green Tea Sorbet

Morimoto's was the hardest to find, though in retrospect it should be pretty obvious that a building fronted by very large colorful curtains is probably not a bank.  It was the most visually distinctive place of the trip, and I was happy that the 'omakase' menu came recommended.  The tuna tartare and sushi were just 'basic', well-executed Japanese food.  The oyster with foie gras was probably the best single taste of the week, and the Indian couple on their first date at the table next to me were envious of the Masala Lobster.

One obvious 'problem' with a meal like this was pairing wine with it.  No problem, there was an 'omakase' drink pairing.  For every dish.  Every single dish.  Sure, they weren't huge drinks, but that's 7 courses and 7 drinks (plus the obligatory pre-meal martini).  Let's just say it's a good thing Daddy took the subway.  The best of the lot was a 'smoky' sake that tasted like scotch.

Another memorable meal.

Day of Lunches

The day of the concert, I didn’t have any dinner plans, so I had a few lunches.

  • A hot dog in Central Park
  • A couple of tacos at Taco Taco in the Upper East Side
  • Freshly shucked clams in Little Italy
  • Pepperoni Pizza in Little Italy
  • Beef Fried Rice in Chinatown

Just a whole bunch of (relatively) small lunches, including authentic Italian and Chinese.  Well, the fried rice was small in the sense of price.

Summary

Since I’ve been to NYC so many times, I briefly wondered why I hadn’t done this sort of thing before.  Oh, that’s right.  It’s expensive.  Most of the previous times I visited, I was broke.

I took in one of the comedy shows they ‘hawk’ in Times Square ($10 plus 2 drink minimum), which was marginal.  I also hit a couple of lounges, including the hotel lounge and the BlueFin.  It always surprises me that places shut down in Times Square by 2AM.  South Beach never shut down, though, of course, if you know where to go, Manhattan is always open.  I’m not as young as I used to be, and I dodged a cougar-attack one evening (if you know what I mean), so it was probably just as well.

If I did this every year, would I enjoy it as much?  Well, hell, yes, obviously.  Am I likely to do it again soon?  Probably not immediately, there’s always too much work and other things going on.  I have an eye on a west coast hockey tour in the spring.  You never know when you will have the chance to take this sort of vacation.

Overall, it was a great trip.  Especially since the first week back, I had to deal with emergency production migrations.  But I digress.

posted @ Monday, October 04, 2010 8:45 PM | Feedback (0)