The case for corporate Excel training investment

Howdy folks. Jeff here again, with my musings on the kinds of things you might put in a business case for a corporate Excel training program.

I think corporate-wide Excel training is too valuable to leave to a Learning and Development team and/or to chance (assuming an organization is large enough to have an L&D function in the first place).

  • L&D likely don’t know that much about how business users are misusing Excel, so how can they offer generic training at arm’s length to remedy said misuse? At the same time, they must support numerous departments, with training being just one aspect of what they do (the other is meetings. Very important meetings, I’m told) and with Excel being just one of many programs that users need training in. So L&D simply can’t specialize in Excel training to a level that’s really going to make a difference at the coal face.
  • The efficiency dividend from training accrues to the units of the people being trained. So units themselves are directly incentivized to invest if it will get them a better outcome…regardless of whether fostering increased human capital falls within that unit’s core job/mandate or not.

So instead of business units saying “It’s not our job to train…it’s theirs” I think they need to be thinking “We’d rather someone else do this, but we’re prepared to fill in the gaps ourselves if it helps our people to solve their business problems in ways that increase quality, improve productivity, and provide higher returns on investment.”

But what would a corporate Excel initiative look like? And how would you sell it to an organization that isn’t aware of the problems with the status quo?

I thought I’d have a crack at putting together a generic business case that addresses these questions. Not put together with any specific organization in mind…rather, this is something gleaned from practically every organization I’ve ever worked with/for, as well as from my time spent moderating a few Excel help groups and so forth.

Love to hear about suggested rewrites, omissions, etc in the comments.

Executive Summary

  • We believe that additional investment in Microsoft Excel-based training will help users solve their business problems in ways that increase quality, improve productivity, and provide a good return on investment, while decreasing some key data-related business risks.
  • Consequently we propose to establish a training program above and beyond that currently offered by Learning and Development. It will focus primarily on educating users on how Excel’s deep feature set can be better utilized to solve common business issues while increasing data integrity; on approaches that users would otherwise be unlikely to be exposed to; and on highlighting routine practices that currently get in the way of increased efficiency.
  • Given the large number of users involved, one-on-one diagnosis and training covering all users is simply not possible. Consequently this initiative primarily revolves around increasing the ability of users to self-diagnose issues and inefficiencies with their existing spreadsheet applications and practices, and to educate them on how to access resources and knowledge that will greatly assist them to self-remedy issues wherever possible. Given the resources available for this initiative, raising awareness of current bad practices and alternatives is possibly the biggest efficiency gain and risk-reducing opportunity that we can offer, at the least cost.
  • Given Excel is the tool by which many people realize/express their commercial analysis, data visualization, business processes, and logic, we believe that in teaching people to use the tool itself better we can also foster increased competence in these fundamental skills/attributes.

Problem Statement

Currently our Excel user base consists largely of users at the basic level, with a much smaller contingent of intermediate users. Almost all users are largely self-taught, often with little exposure to the vast wealth of constantly evolving best-practice resources available both offline and online. Consequently, despite being motivated to use Excel as efficiently as they can, even intermediate users barely scratch the surface of Excel’s productivity tools. At the same time, because some of Excel’s features are so very easy to use, those same features are also easily overused to the point of misuse. Consequently:

  • The majority of users spend much more time manually formatting and manipulating data than they need to, often with little or no backwards auditability.
  • Sometimes our approaches to analytical problems involve far more complexity than a more suitable approach would require, with little or no additional benefit or confidence as a result.
  • Many of our business processes hosted within Excel spreadsheets are convoluted and unclear. ‘Mission-critical’ spreadsheets are often bloated, difficult to audit, and subject to catastrophic spreadsheet failure – possibly without users realizing.
  • Modelling practices and spreadsheet practices across the organisation are inconsistent, and there is little or no peer review or development in place to ensure people use the Excel tool smartly, wisely, and efficiently.
  • Best practices and learning are not perpetuated in a formalized way throughout the organization. The emphasis of our expert users currently remains fixing things that are broken, rather than using education to avoid bad practices in the first place.

While our Learning and Development (L&D) unit offers a number of appropriately pitched courses focusing on new/basic users, these are functionality-centric rather than business-specific. Consequently, such courses often don’t expose users to alternative practices. At the same time, L&D staff are unlikely to be fully cognizant of the vast amount of quality free resources available online, as well as some paid offerings that may prove more cost-effective than the traditional course vendors we have previously used.

As a result –

  • Most people keep using the very small subset of Excel’s functionality that they know about as a hammer on problems that aren’t nails but screws.
  • We don’t insist on, foster, or seek to measure incremental improvement of our analysts’ skill-sets.
  • We allow users to build mission-critical applications every day in Excel with little or no discipline.

The status quo incurs a very real opportunity cost to the organization, given that –

  • Advanced users can often do something in a fraction of the time that it takes an intermediate user;
  • The automation capability of Excel is staggering;
  • It’s not actually that hard to turn basic users into intermediate ones, and intermediate users into advanced ones, so that they can do things in a fraction of the time they currently take, if not automate them completely.

Desired State

We propose to train our analysts to become not just better Excel users, but also more effective analysts. Note that this initiative isn’t just about the use of Excel itself, but also about related disciplines such as effective data visualization, business process design, and logic. Given Excel is the tool through which people express their analysis, data visualization, and logic, teaching them to use the tool better will also prompt them to think harder about what they are using the tool to achieve.

The desired state that we seek to arrive at would be demonstrated by the following outcomes:

  • Our people would leverage Excel’s rich feature-set more fully in order to achieve better analytical outcomes with much less effort. At the same time, they would be more fully cognizant of common and potentially avoidable spreadsheet design flaws and pitfalls.
  • Our people would better judge the appropriate amount of time and precision required for any given analytical task. They would be less likely to over-analyse/over-build, and more cognizant of the diminishing returns that often accompany increased complexity of approach.
  • Mission-critical spreadsheets/templates would be more robust, simpler to maintain, and more user-friendly, as well as easier for successors to understand and maintain. This should lessen business risk via decreased risk of incorrect models and catastrophic spreadsheet failure, increased model auditability, and a more consistent approach to spreadsheet development and use across teams.
  • Once our people better realize the capability of the Excel application, they would begin to actively look for other efficiency opportunities where they can leverage their new Excel skills, and share their approaches.

Approach

This initiative is largely focused on increasing human capital – which historically has been more the domain of the organization’s Learning and Development team rather than our own unit. However, we propose entering into this space due to the following factors –

  • Reduction of L&D’s technical training capacity;
  • The opportunity cost of not realizing some fairly low-hanging efficiency dividends;
  • The cost of procuring external training, and the risk that such training would not have the long-term positive impact on current practices that we seek;
  • The increasing time pressures and demands on staff – meaning increased barriers to utilizing external trainers from both a financial and a time perspective; and
  • The fact that we already have a strong interest in developing training materials that encompass Excel, Modelling, VBA, and Business Process Improvement.

The primary outcomes this initiative will deliver are –

  1. Increased efficient/effective use of one of our main business productivity tools – Microsoft Excel – and greater awareness of current sub-optimal processes and approaches.
  2. Education of users to self-identify efficiency/risk issues with existing spreadsheet-based processes/approaches, and provision of avenues and resources to address these issues.
  3. Facilitation of increased peer-to-peer learning and peer-review opportunities between Excel users throughout the wider organization.

In doing this, the initiative will take a multi-pronged approach:

  1. Remedial mitigation of mission-critical spreadsheets/processes
  2. ‘Best Practice’ efficiency/effectiveness education
  3. Peer-to-peer user group
  4. Identification, creation, and dissemination of useful resources/templates
  5. Evaluation of additional tools/external training opportunities

These are covered in more detail below.

1. Remedial mitigation of mission-critical spreadsheets/processes.

We will work directly on selected mission-critical spreadsheets and spreadsheet-based business processes in conjunction with business owners to identify design issues and prioritize remedial action.

2. ‘Best Practice’ efficiency/effectiveness education

We will deliver multiple training sessions/workshops on best practices covering modelling, spreadsheet risk mitigation, and using the Excel application more effectively and efficiently. These will also draw on ‘lessons learned’ from the above step. This is critical given that several of the spreadsheets prioritized for investigation in the preceding step were constructed within the last six months – highlighting that we have an ongoing issue, and not just a problem with legacy spreadsheets.

Sessions will cover –

  • How recent versions of Excel have extended its capability from mere spread-sheeting into business intelligence.
  • Data organisation: Many things in Excel go from challenging to easy simply by changing how source data is organized.
  • Understanding how Excel calculates, and avoiding calculation bottlenecks and overhead by wisely choosing/using formulas for efficiency.
  • Leveraging pivot tables for aggregation and reporting; and utilising Excel’s dynamic table functionality to cut down on complexity.
  • Using advanced functionality to augment user interfaces by incorporating slicers, advanced filtering, VBA (aka macros), and dynamic SQL queries (for simplified data retrieval and processing) into spreadsheets/models – a short sketch of the last of these appears after this list.
  • Conditional formatting and data validation techniques that can help to better validate user input.
  • Tools (both free and commercial) that can help users to work more efficiently in the Excel environment.
  • Troubleshooting and streamlining existing models that users have inherited.
  • How users can get free help, advice and inputs from online resources and forums.
  • Modelling/analysis best practices.
  • Data visualization best practices.
  • Spreadsheet development best practices, including case studies covering lessons learned from recent work.
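
To give a flavour of the dynamic SQL item above, a query can be as small as the sketch below. This is purely illustrative – the sheet names Data and Report, the Region/Amount columns, and the assumption that the workbook has been saved and the ACE OLEDB provider is installed are all placeholders, not fixed requirements:

    Sub PullRegionTotals()
        'Aggregate straight off a worksheet with SQL - no helper columns, no manual subtotalling
        Dim cn As Object, rs As Object, sql As String
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
                "Data Source=" & ThisWorkbook.FullName & ";" & _
                "Extended Properties=""Excel 12.0 Xml;HDR=YES"""
        sql = "SELECT Region, SUM(Amount) AS Total FROM [Data$] GROUP BY Region"
        Set rs = cn.Execute(sql)
        Sheets("Report").Range("A2").CopyFromRecordset rs   'dump the aggregated results
        rs.Close: cn.Close
    End Sub

Because the SQL string is just text, it can be assembled on the fly from cell values – which is what makes the queries ‘dynamic’.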

Session attendees will also have access to associated hand-outs/workbooks with supporting content and links to further in-depth resources. Each session will be repeated multiple times on a rolling basis to facilitate/encourage maximum attendance. Managers will be encouraged to actively promote attendance at these sessions and, where appropriate, include them in staff development plans.

3. Peer-to-peer user group/help forum.

We will set up and facilitate an internal Excel User Group – potentially with a supporting software forum/message board. This will be used to –

  • Provide peer-to-peer learning and support opportunities. A support network that extends beyond our unit alone will provide far wider coverage than I can offer on my own.
  • Identify/evaluate mission-critical spreadsheets across our unit that potentially impose significant business risk and/or administrative burden, and provide owners with some options to restructure/re-engineer them accordingly.
  • Provide attendees with more hands-on exposure to tools/tricks that they can use to re-engineer their own spreadsheets as the need arises, and encourage all users to draw on the skills of identified expert users on a consultancy basis for advice and peer review as needed – perhaps via the user group itself.

4. Identification, creation, and dissemination of useful resources/templates

We will identify/create a repository of quality free training resources (including construction of additional in-house resources and ready-to-use macros/templates where warranted) that supports further in-depth professional development and at the same time reduces our dependency on paid external courses. This will draw heavily on the large amount of free resources and training materials available online, which equal – and in many cases surpass – paid external training content in terms of learning outcomes.

We will publish a weekly ‘Productivity Hack’ article on the organizational intranet home page suitable for users of all levels. These articles may also reference a number of the outstanding productivity resources published each week on the Internet by the worldwide Excel Development/Training community (including blog posts, technical articles, training videos et cetera).

5. Evaluation of additional tools/external training opportunities

We will work with IT to evaluate new additions to Excel such as PowerPivot – a free extension to Excel that allows users to crunch, filter, and sort millions of records with very little overhead, as well as incorporate multiple data sources easily into PivotTable-based analysis, using an interface they are already familiar with. PowerPivot extends the capability of Excel to the point that it might reduce the need for some of the other applications we currently use to amalgamate discrete data sources, such as SAS.

We will also offer our assistance to Learning and Development to help them identify/rate any external training providers they use going forward.

14 thoughts on “The case for corporate Excel training investment”

  1. Jeff,

    The basic premise you advance is undoubtedly a good one, but before you progress it further, I think you need to ask yourself whether this is going to fly, or just be shot down in the early stages. Some suggestions:

    What is the level of enthusiasm within your organisation?
    Have you got a sponsor or advocate at a senior level?
    Could you do this in stages and demonstrate the value – such as building standard templates?
    Would your staff turnover derail the benefit (in other words, are you just training people up who will then leverage this into a better paid job elsewhere)?
    What is this going to cost (always an important issue)?
    Is it likely that IT will support you?

    We have an L&D department – who are very good – but they determined that there was little benefit in going this far. Instead, they have a series of simple-to-follow modules which demonstrate various key topics. We have a number of standard templates as well.

    Hope this helps.

  2. I’d prefer a more question-based approach: ask a user about her/his tasks and what (s)he would like to improve/get rid of.

    Your approach looks too tool-driven to me. Other tools might be more efficient (improving information ergonomics, improving administrative logistics, automating Word, Outlook, Powerpoint, etc.).
    I wouldn’t aim at enlarging the number of Excel gurus in a company.

    Try to reduce the user’s necessary input in procedures to decision making and inputting new intelligent data (that are the result of expertise); leave the rest to any kind of automated process, be it in Excel, Word or C++, etc.

  3. Hi Alan. Yes, that’s the hard part. This is only going to fly if you can make decision-makers understand that there is a very real opportunity cost of bad use of business tools, and get them to realise that while such an initiative might not seem as urgent as some other things on the work programme, the sooner the ‘desired state’ is arrived at, the better placed analysts will be to deliver urgent things on the work programme. Maybe you can only ever sell something this big if you are preaching to the converted and/or you have a track record within an organization of realizing some big efficiency dividends by ‘fixing’ someone’s bad use of Excel. And a staged approach is probably easier to sell than a ‘big bang’ approach like this. But I think an integrated approach will give a bigger ROI.

    And quite possibly some analysts wouldn’t stick around. But maybe it’s better to have higher turnover of people doing things smarter than lower turnover of people that are doing things dumber. Plus in a big organization, some analysts are going to use their new-found skills to get another job higher up in that same organization.

    The cost of something like this would be at least one FTE, and that FTE is going to be flat out on this. Problem is, that FTE’s boss probably has a whole heap of other things that they also want delivered, so might not want to or be able to ring-fence that FTE to do this and nothing but this.

    Most of the places I’ve worked at, IT are too busy running the business’ outward-facing machinery to get involved/interested in whether analysts in, say, HR or Finance are making efficient use of their tools. But I don’t see lack of IT support as an issue.

  4. Hi snb. One-on-one is definitely a better way to uncover need. But what if there’s just one of you and a couple of hundred or thousand of them? One of the most satisfying things I’ve done recently is to deliver an hour-long training presentation to a group of about 50 analysts. Compared to the presentations that they sat through before and after mine, they looked spellbound when I showed them how easily you can do complicated things instantly when you leverage off things like slicers, Tables, VBA, and SQL. Within that presentation, I gave them access to things like my Filter Pivot routine and others that I’ve posted here recently, and have had many emails back saying “You have cut hours off my time each week, and that’s just from one thing you showed me. Now I’m looking at what else I’m doing inefficiently.”
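
    For anyone wondering what a routine like that involves, here’s a stripped-down sketch of the general idea – to be clear, this isn’t the actual Filter Pivot code I gave them, just an illustration of how little VBA it takes to drive a PivotTable filter:

        Sub FilterPivotTo(pf As PivotField, sKeep As String)
            'Show only the pivot item named sKeep and hide the rest
            Dim pi As PivotItem
            pf.Parent.ManualUpdate = True       'don't recalc the pivot after every change
            On Error Resume Next                'some items may already be hidden
            For Each pi In pf.PivotItems
                pi.Visible = (pi.Name = sKeep)
            Next pi
            On Error GoTo 0
            pf.Parent.ManualUpdate = False
        End Sub

    Call it with something like FilterPivotTo ActiveSheet.PivotTables(1).PivotFields("Region"), "North" and it does in a second what people otherwise tick and untick by hand.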

    Yes, it does focus almost exclusively on Excel – largely because Excel is what I know. A lot of the inefficient use of Word and PowerPoint is mainly around writing pointless reports and putting together pointless presentations, as far as I can see…not so much inefficient use of the tools themselves. That said, if you could teach someone to use Word’s Outline feature as a way to help them think about structuring content, then right there you would add value to the organisation.

    I don’t understand why you wouldn’t want to enlarge the number of Excel gurus in a company. Why not? Anytime some non-guru comes to see me, they walk away with a much better approach and a very real time saving. That said, my draft business case also focuses on getting people to use external Excel gurus via blogs, help forums etc.

    On your last point, couldn’t agree more. Problem is, changing anything on that level at a big bureaucratic company is bound to result in concussion from banging one’s head on the desk. :-)

  5. Following on from the comments above, I added another point to the Exec Summary:

    Given the large number of users we have, one-on-one diagnosis and training covering all users is simply not possible. Consequently this initiative primarily revolves around increasing the ability of users to self-diagnose issues and inefficiencies with their existing spreadsheet applications and practices, and to educate them on how to access resources and knowledge that will greatly assist them to self-remedy issues wherever possible. Given the resources available for this initiative, raising awareness of current bad practices and alternatives is possibly the biggest efficiency gain and risk-reducing opportunity that we can offer, at the least cost.

  6. I wouldn’t mention any negatives (inefficiencies, bad practices, ill-devised approaches, misuse, etc.); don’t judge people on abilities that were not required when they were being hired. Stress the positive aspects: ‘improving’, ‘smoothing’, making life easier, letting the machines do the work, etc.
    It shouldn’t look like a crusade against badly functioning employees, because that’s not what they are.

  7. SNB, sounds like you’re saying “Don’t put a problem statement into the business case”. I personally wouldn’t fund anything to do with training unless I could be convinced that something is broken. That’s the problem…most people (including users and managers) don’t know that something is broken. What’s broken here is that inefficiency imposes a significant opportunity cost. A real dollar amount.

    The abilities were required when they were hired…almost all jobs in, say, HR or Finance or Data Analysis call for Excel, and often ‘Intermediate’ Excel at that. But hiring managers either don’t know how to test for this, or think that Intermediate Excel means ‘can do a VLOOKUP or IF statement’. So we get people using formulas like this, which I found in a spreadsheet just a few minutes ago:

    =IF(J4=1,$F4*J4,IF(J4=0.9,$F4*J4,IF(J4=0.8,$F4*J4,IF(J4=0.7,$F4*J4,IF(J4=0.6,$F4*J4,IF(J4=0.5,$F4*J4,IF(J4=0.4,$F4*J4,IF(J4=0.3,$F4*J4,IF(J4=0.2,$F4*J4,IF(J4=0.1,$F4*J4,0))))))))))

    Yep, that person knows an IF statement, alright. Must be advanced. But this does exactly the same thing:
    =$F4*J4
    —edit— I should point out that J4 had data validation that only allowed the entry of 0,0.1,0.2,0.3,0.4,0.5,0.6,0.7,0.8,0.9, or 1.0…which is why these formulas are equivalent.

    This person is functioning badly when it comes to Excel. Don’t get me wrong…they are not a bad person, and their formula works. But how long did they spend putting that gem together? I’m betting hours. And is this the only case of them being suboptimal? I’m betting not. If working with data currently takes up 30% of this person’s time, I’m betting that good training will get that down to 10%.

    Inefficiency like this costs real money. And badly constructed mission-critical spreadsheets are a real risk that can cost really, really big money. Pointing out the significant risk of practices like this, and the significant opportunity costs of the status quo, seems like the clearest way to drive home to budget holders that accepting the status quo is very, very costly.

    I have seen well-paid Senior Analysts spending 40% of their time cutting and pasting. I saw one guy who copied transactional data every month into a workbook, made a Pivot, then filtered that Pivot on each category, printed the output, retrieved the printout, and then entered the aggregated totals into another spreadsheet. What a waste of his salary.

    I’ve seen models that take 2 minutes to recalculate after each and every change and that are used by large groups of analysts. What a waste of hours of aggregated time each week for users. I’ve restructured a few of them so that they do the same thing in under a second. Instant savings.

  8. @Jeff

    ..’ I’ve restructured a few of them so that they do the same thing in under a second …’

    But how much time would it have cost if they each had to reinvent these models themselves ?

  9. The model owners couldn’t restructure these models…the models were so far gone that it took someone who really knew what they were doing to unwind the logic and fix them. So yes, sometimes it takes an expert to fix legacy inefficient stuff after the fact, and I’ve got that under point 1 under the ‘Approach’ heading in my post. But an ounce of prevention is worth a pound of cure, which is why I’m so big on education.

    Successful mitigation has to be largely based on education. Typically there’s just one of us and a few hundred or thousand of them. I already have a backlog of identified problematic models in the reservoir that is our network drive to fix. God knows how many unidentified problematic ones are lurking below the water line…In the case of the models I mentioned above, no-one diagnosed a problem until I came across them in use quite by chance, because no-one knew that such slow calculation was completely avoidable. They didn’t know they were suffering from a disease. I only stumbled across them because I happened to be walking past someone’s desk and saw the incredibly slow recalc take place. The model users just thought that slow calculation was simply a byproduct of having a complex model. But in actual fact, in most cases I’ve seen, performance issues are more often than not due to inefficient spreadsheet design, coupled with use of volatile functions. In the case of one of these models, I didn’t even touch the overall design much at all…rather, most of the speed enhancements came from stripping out all the volatile OFFSETs and INDIRECTs and replacing them with INDEX etc.
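
    That’s also why I’m keen to give people dead-simple self-diagnosis tools. As a rough sketch of what I mean (the function list and scope here are illustrative only, not a finished audit tool), a few lines of VBA can tell a user how many of their formulas lean on volatile functions:

        Sub CountVolatileFormulas()
            'Rough head-count of formulas referencing volatile functions in the active workbook
            Dim ws As Worksheet, rng As Range, c As Range
            Dim vols As Variant, v As Variant, n As Long
            vols = Array("OFFSET(", "INDIRECT(", "NOW(", "TODAY(", "RAND(")
            For Each ws In ActiveWorkbook.Worksheets
                Set rng = Nothing
                On Error Resume Next                'no formulas on the sheet raises an error
                Set rng = ws.UsedRange.SpecialCells(xlCellTypeFormulas)
                On Error GoTo 0
                If Not rng Is Nothing Then
                    For Each c In rng
                        For Each v In vols
                            If InStr(1, c.Formula, v, vbTextCompare) > 0 Then n = n + 1: Exit For
                        Next v
                    Next c
                End If
            Next ws
            MsgBox n & " formulas reference volatile functions."
        End Sub

    It won’t fix anything by itself, but a count in the thousands is exactly the kind of symptom that should send someone to a spreadsheet doctor rather than to Manual calc mode.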

    But…while model owners or the original model builders don’t have the skills to renovate existing models like this (because fixing a near-incomprehensible and undocumented jumble of tens of thousands of formulas is not for the faint-hearted), the model builders can be educated to build new models that are simpler, faster, impose lower compliance costs on users, and are just as precise to boot. And all model users can be made aware that if a model takes upwards of minutes to recalculate (even when entering data in an entirely new sheet that nothing points to), or has a file size far, far larger than the raw footprint of the data – to the point that models are 100MB or more without good cause – there’s something wrong with the approach. Then they’ll know they should consult a spreadsheet doctor rather than suffer ignorantly in silence. And they should be warned that ignoring the symptoms and simply switching to Manual calc mode can be dangerous.

    Again, because there are only so many experts to go around, and those experts don’t know what pain the non-experts are ignorantly suffering, this initiative primarily revolves around increasing the ability of users to self-diagnose issues and inefficiencies with their existing spreadsheet applications and practices, and to educate them on how to access resources and knowledge that will greatly assist them to self-remedy issues wherever possible.

  10. Hi Jeff,

    Those are sound ideas. There are some style points I would change.

    When I try to sell an idea I always try to remember that the last thing I want to do is sell my idea. What that really means is, make sure we present what the audience wants to hear – not what we want to say. For example, take this line:

    “We believe that additional investment in Microsoft Excel-based training will help users solve their business problems in ways that increase quality, improve productivity, and provide a good return on investment, while decreasing some key data-related business risks”

    Consider:
    “We can make real improvements to quality, productivity, and profit through leveraging our in house business experts and making minor investments in their technical skills.”

    Your idea is to improve technical skills (specifically in XL). The audience isn’t looking for that. They’re looking for profit. Talk to that motive. Make it first. Drop the “We believe” stuff. That sounds unsure. And if WE’RE unsure, our audience won’t be sure of us.

    Find concrete examples. Make it real. Pump up the upside. For example – MEASURE THIS: “•Advanced users can often do something in a fraction of the time that it takes an intermediate user”
    then say it like this: “We assigned a task to Max and Less. Max finished in X minutes. Less took Y minutes. Thus Max produces Y/X times as much as Less but earns only 5% more. Annualized, the ROI is $$$$. The difference is training.” Do the same for the downside.

    Think of objections. Why would each audience member be threatened by this? IT – Control. Finance – Costs. Executives – Failure. Be ready with responses.

    Shorten it up. Think sound bites. Get interest first and keep the “How To’s” in our back pocket. If the audience is interested, they’ll ask.

  11. I liked the Hatmaker advice. He’s doing the equivalent expert training in presenting business cases that Jeff’s business case is trying to do for Excel users. Same raw materials, just organised in a different way that produces better outcomes. I guess if you say you’re going to run training courses in Excel, what managers see in their mind is a bunch of people sat at a screen typing in lists of numbers. Maybe get some of those managers in the training room and show them an un-refactored spreadsheet and then the re-factored one so they can see the speed difference. Their greedy little minds will think they’ve found the $$$s themselves.

    I don’t tend to use really big and complex spreadsheets, but would be interested in any top tips for avoiding the pitfalls that lead to slow re-calcs. Plenty of ‘instead of this, do this’ examples with background reasoning for the alternative would be ideal.

