Week 10 Performance Report — Operation Dunk 2010

As you may have guessed by the absence of performance reports, Operation Dunk 2010 was stalled for a bit.  I allowed the demands of work and family to get in the way, but we’re back on the beam.  Luckily, the damage wasn’t too great, as I have remained active. 

  • Weight — Up 2.5 lbs from week 4 (251.5 from 249)
  • Wii Age — I’m at 37, which is down from my last two readings of 45 and 55.
  • My balance and endurance have clearly been improving.  I believe that stems from my concentration on the obstacle course, as that contains a jumping motion and gives a pretty decent 2-5 minute workout per run.

    Week 4 Performance Report — Operation Dunk 2010

    Making good and steady progress.  I’ve made higher and higher scores on some of the multi-player games, but unfortunately I can’t seem to save and track my ongoing progress on those.  Of course, my son regularly humiliates me on the snowball fights so maybe I should be grateful for no tracking!

    • Weight — Down 4.5 lbs from week 3 (249 from 253.5)
    • Wii Age — I’m still at 55, though the missing week’s age turned out to be 45.

    The balance tests are still a bit of a problem.  I’m standing straighter — my center of balance is just about in the middle now — but I still have some issues with shifting my weight.  Also, I find that my right leg is feeling the workouts more.  There clearly was something I missed in rehabbing from my long-ago broken pelvis.

    Week 3 Performance Report — Operation Dunk 2010

    Now that I finally have published that promised post on performance reporting, I owe an update on Operation Dunk 2010.

    • Weight — Down 7.5 lbs from week 1 (253.5 from 261)
    • Wii Age — Forgot to measure this week… will assume I’m still at 55.

    I’m using the games to get some basic endurance and balance improvements underway.  I notice the difference already in the pace that I can keep up in my 1/4 – 1/3 mile walk to my office from the parking lot.

    Finally, I’ve started the WiiFit strength training.  This has put some pressure on my balance issues, as the calisthenics in the program need me to balance on one leg often!

    Week 2 Performance Report — Operation Dunk 2010

    I was about to post on how unreliable performance reporting — progress, status, and forecast — is one of the first signs of project trouble.  However, I realized that I hadn’t posted on my own “Operation Dunk 2010” performance, so here goes:

    I’ve made some progress on my weight (the weight reduction KPI is related to the improvement of my jumping capability):

    • Weight — Down 3.5 lbs from week 1 (257.5 from 261)

    Also, I forgot to mention that I’ve started work on my balance and coordination capability with my Wii Fit.  There is a basic balance test that combines with my BMI to give me a Wii Age.  That’s probably as good a proxy for balance and coordination improvements as any, so it will be KPI #2.

    • Wii Age — Down six years from week 1 (55 from 61).
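    The same two KPIs recur across these reports, so a minimal tracking sketch may help tie them together.  This is purely illustrative — the weigh-in log below is transcribed from the figures in the posts above (with week numbering assumed), not from any actual tracking file:

```python
# Hypothetical weigh-in log (lbs), reconstructed from the posts' figures.
weigh_ins = {1: 261.0, 2: 257.5, 3: 253.5, 4: 249.0, 10: 251.5}

# Report each reading as a delta from the previous one, the way the posts do.
weeks = sorted(weigh_ins)
for prev, curr in zip(weeks, weeks[1:]):
    delta = weigh_ins[curr] - weigh_ins[prev]
    print(f"Week {curr}: {weigh_ins[curr]} lbs ({delta:+.1f} from week {prev})")
```

    The same shape works for the Wii Age KPI — any dict of week-number to reading will do.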

    What is “ready to use”?

    Sorry, Derek, it's prêt-à-l'emploi not prêt-à-porter.

    I’ve found that the packaged application and custom development fields make at least some nod to standard project management practices.  No doubt there are many suboptimal practitioners, but few ignore or openly bad-mouth project management.

    Infrastructure — servers, network, and desktop — appears to still be another matter.  I’ve been shocked by how much expediting and Potemkin planning — plans that are only for show — I’ve seen in the last six months.  One of the most frustrating practices is when IT Operations claims victory over useless milestones after having claimed that planning would have done nothing but slow them down.

    Perhaps slowing down would prevent rework and wasted effort (by one’s customers!).  For example, our service provider’s infrastructure team tried to get credit for server delivery.  Except that the server was only powered on and connected to a network — it was far from “fit for use.”  No one could access the server remotely — it is based in a data center, after all — and even if they could, potential users were then stymied because no log-ons had been set up.  Never mind that we couldn’t use the server anyway because it hadn’t been qualified (we’re in an FDA-validated industry).

    If you don’t own “fitness for use” as the underpinning of your deliverables, you aren’t managing your projects, you’re playing “whack-a-mole”.

    What will I need to “build” so I can dunk?

    Per my last full post, I want to re-build my capability to dunk.  And to make sure that we all “know what done looks like”, by dunk I mean to dunk a men’s regulation basketball in a 10-foot goal by EOY 2010.   As I broke down the work, there are at least three sub-capabilities I need to have:

    1. Sufficient jumping ability to get my hands far enough above the rim.
    2. Sufficient “ball skills” to dribble or manipulate the ball (so it can be in my hands far enough above the rim).
    3. Sufficient balance and coordination to manage those two capabilities.

    My dim memory of basic physics (see this site on vertical jump power), my awareness of my ever-expanding waistline, and multiple years of rust on my jumping muscles will have me focus on jumping ability first.  This will involve reducing the amount of mass I’ll need to move — KPI #1 — and increasing the acceleration I can impart to that mass (not sure of the KPI for this one yet).

    BTW: The KPI #1 baseline is 261 lbs as of 1/4/2010.  I have a Q1 target at home, but I don’t have it handy (probably in the 240s).
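    The basic physics of the mass-vs-acceleration trade-off can be sketched quickly.  The standing reach and rim clearance below are assumed figures for illustration only (they aren’t from the post); the physics is just v² = 2gh for the takeoff velocity and E = mgh for the energy to raise the body through the leap:

```python
import math

G = 9.81             # gravitational acceleration, m/s^2
IN_TO_M = 0.0254     # inches to meters
LB_TO_KG = 0.45359237

# Assumed figures, not from the post: standing reach and hand clearance.
rim_height_in = 120       # 10-foot goal
standing_reach_in = 90    # hypothetical 7'6" standing reach
clearance_in = 6          # hand clearance above the rim needed to dunk

# Required vertical leap, and the takeoff velocity it implies (v^2 = 2gh).
leap_in = rim_height_in + clearance_in - standing_reach_in
leap_m = leap_in * IN_TO_M
takeoff_v = math.sqrt(2 * G * leap_m)

def jump_energy_joules(weight_lb):
    """Energy to raise the body's mass through the leap height (E = m*g*h)."""
    return weight_lb * LB_TO_KG * G * leap_m

print(f"Required leap: {leap_in} in, takeoff velocity: {takeoff_v:.2f} m/s")
print(f"Energy at 261 lb: {jump_energy_joules(261):.0f} J")
print(f"Energy at 240 lb: {jump_energy_joules(240):.0f} J")
```

    The point the post makes falls out directly: the leap height (and so the takeoff velocity) is fixed by the rim, so shedding mass — KPI #1 — linearly reduces the energy the jumping muscles must deliver.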

    BI, Deliverables, and Change Control

    This post may court the Business Week curse, but I’m highlighting this story on BI and the recession (here) as the jumping off point for a couple of observations.  Rob T.’s comment in the piece (sorry, but I can’t link directly) notes that:

    Unlike ERP, BI can be implemented step-wise, first in targeted, strategic areas, and then using a broad brush once its value has been proven. A wise strategy for this economy is to start small, pick a problem or area where a quick win is possible and attack it with a 60-90 day effort.

    I mentioned this opportunity for short, focused projects in my WSJ interview.  These short cycle times and the nature of BI work pose problems for traditional ERP change control and deliverables definition. 

    For example, what exactly is “done” on a BI project?  The traditional ERP definition of deliverables — focused on processes that deliver tangible, measurable outcomes (e.g., Order to Cash, Purchase to Pay, Hire to Retire) — doesn’t work well for BI.  Typically, reporting projects have focused on a # of reports, explicitly defined.  Also, in analytics projects those reports or queries often spark more ideas for how data can be cross-tabbed, projected, etc.  Do you always want to be presenting a change order for new reports?

    What has worked for you when defining deliverables and change control for BI initiatives?  Now if you’ve read this blog for a while, then you can guess that I’m in tune with attacking these issues via a capabilities-focused approach to deliverables.  This approach points one in the right direction when defining what done looks like.  However, many PMs find it hard to grok the requirements and required capabilities of analytics-savvy stakeholders and consultants.

    Deliverables, work packages, and the schedule

    The temptation to fix a schedule and get to work is constant in enterprise IT.  It is particularly alluring for any application tied to a SOX-compliant landscape — some governance models only allow two opportunities/year to deliver — where project durations strongly suggest themselves and time is always “a-wasting”.

    Of course, as Glen Alleman reminds us here, starting with the schedule  is wrong.  I won’t recapitulate his post here, but I’ll borrow from his comment to another post which points out the fallacy in this kind of thinking:

    [M]any…process improvement projects have failed, along with Enterprise IT, because the WHY of the effort is not established or well understood. The principles establish WHY we should be doing something. The practices of course tell us HOW.

    This rush to “get working” short-circuits one of the most important functions of a WBS: stakeholder management.  Properly defined deliverables and work packages aren’t simply inputs to the schedule, budget, etc.  If nothing else, a WBS is the most accessible framework for a discussion with one’s stakeholders that ensures that the what of the project supports the why of the project.  Wouldn’t it be a good idea to make sure that why and what are elaborated and priorities agreed upon — even at just a couple of levels — before getting down to who, when, where, and how?

    Accountability for “soft stuff” deliverables

    Per an earlier post (here), I have a bee in my bonnet about “supporting” deliverables and how to measure their success.  The excellent comments from Glen and Stephen (here) pointed to the answer.  From Glen:

    We are looking for cost savings, efficiencies, and other process improvements. This is the typical work flow process improvement approach. Extend that to the human processes – monetized – and you can define MOP’s and MOE’s [measures of performance and effectiveness respectively].

    In retrospect, it should be obvious that training deliverables — and their measures and incentives — should be accountable for the relevant process/solution/project/program success measures.  In other words, CRM call center training should be judged by how well the call center solution delivers the intended capabilities.

    This approach is not so obvious to many training professionals, largely because many have never been held to the same standard as process owners.   My experience — from observing a particularly savvy SAP customer — is that a few pointed questions do concentrate the trainers’ minds:

    • Are you worried that the training does not support the actual execution of the processes?
    • What do we need to change in our training approach so that it delivers value to the project?
    • Why am I spending money on training if it can’t be held accountable for project/process success?
    • Do you still wonder why trainers aren’t better paid?

    All deliverables should be created equal

    OK, that title isn’t exactly what this post is about, but I couldn’t resist the echo of the U.S. Declaration of Independence.  When I posted on promoting a deliverables-oriented mindset a few weeks ago (here), I started a draft post about project deliverables that don’t get held to the same standard as most product-related deliverables. 

    The first group that came to mind was supporting deliverables: training, [added 9 Dec for clarity] organizational change management, etc.   While such deliverables may be tangible, validating their effectiveness is more problematic.  At least in my experience, the tendency in both cases is to focus on production measures (e.g., classes delivered, content uploaded/downloaded) or on soft measures such as customer or training satisfaction.

    We wouldn’t accept such fuzzy success measures for our solution deliverables, would we?  Have you all seen better ways to measure and ensure the effectiveness of such enabling deliverables?  I have a few ideas and experiences, but I’d like to hear from you all first.
