Day 3

The closing day of the Gartner BI Conference has arrived. Time to look through the seminar list and pick sessions on anything I haven't yet explored, or topics I'd like to hear about from a different angle. I'll also be curious to see whether the themes from the first two days carry into the third.

The day opened with an 8:15am presentation by Bill Holtzmann on Building a BI Strategy. Some of his comments echoed those of other speakers. The high points:

  • Your strategy should include:
    • Why are we doing this?
    • What are the processes to be implemented?
    • Who is involved?
    • When will it happen, and what is the high-level schedule?
    • How will we accomplish our objectives?

DOs:

  • WHY: Give your project a business reason as part of its name. Example: "Enterprise Performance Visibility Initiative" – something people can understand and relate to
  • WHAT: Identify the information required, the technology and the architectural options
  • WHO: Identify the organizational structure needed, teams, staffing and funding. Also identify the leadership and governance method.
  • WHEN: List any iterations involved. Publish a schedule and indicate the priority of each step
  • HOW: List the current and future states and identify the gaps. Spell out any changes to the business, decision, analytic and information processing processes. Be sure to include all costs and risks.

DON'Ts – Five Fatal Strategic Flaws (though more than five made the list):

  • If you build it, they will come.
  • Governance is so yesterday
  • Data Quality problems? What Data Quality problems?
  • "Just give me a dashboard"
  • Managers want to dance with their own data, and not share (lack of visibility)
  • "Our Business Applications Vendor will supply it"
  • We can outsource the whole thing.

Recommendations:

  • Identify the Business Need
  • Recruit a sponsor
  • Identify the problems with not having a strategy
  • Develop the strategy with a cross-functional team
  • Scope, scale and iterate the solution

Guest Keynote: The Signal and The Noise

Nate Silver is an author and statistician who runs 538.com. He gained wide recognition for his very accurate predictions of the 2012 Presidential election. His discussion centered on the analysis process, drawing conclusions, and the implications of those conclusions. Key points:

  • Many scholarly research findings are actually false – follow-up researchers cannot replicate them. False positives are common, so be on the lookout (the quick simulation after this list shows how easily they arise)
  • Why is Big Data not providing Big Progress?
    • More data means more ideas and possibly more conflict – see political divisions today after growth of the internet and cable TV, and wars of earlier centuries after the printing press was invented
    • People pick debate sides based on the information they consume – compare the political divides among MSNBC, CNN, and Fox News audiences
    • Data can be cherry picked, and produces even more bias
    • Are the computer analysis results features or bugs? It's hard to know.
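
That replication point is easy to demonstrate with a quick simulation. This is my own illustration, not Silver's: run many experiments where the two groups are drawn from the same distribution, and a "significant" difference still shows up about 5% of the time at a typical significance cutoff.

    import random
    import statistics

    random.seed(42)

    Z_CUTOFF = 2.0        # ~2 standard errors, roughly a 5% two-sided test
    N_EXPERIMENTS = 1000
    SAMPLE_SIZE = 100

    false_positives = 0
    for _ in range(N_EXPERIMENTS):
        # Both groups come from the SAME distribution: any "effect" is pure noise.
        a = [random.gauss(0, 1) for _ in range(SAMPLE_SIZE)]
        b = [random.gauss(0, 1) for _ in range(SAMPLE_SIZE)]
        diff = statistics.mean(a) - statistics.mean(b)
        se = (statistics.variance(a) / SAMPLE_SIZE
              + statistics.variance(b) / SAMPLE_SIZE) ** 0.5
        if abs(diff) > Z_CUTOFF * se:
            false_positives += 1

    print(f"'Significant' findings with no real effect: {false_positives}/{N_EXPERIMENTS}")

Publish only the positives, and the literature fills up with findings nobody can replicate.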

Steps to accurate analysis:

  • Think probabilistically – include the margin of error (see the sketch after this list)
  • Know where you are coming from – know your blind spots, and flaws and biases in your data
  • Think, and err – trial and error is part of the process
  • For businesses, the 80/20 rule of analysis applies: the first 80% is easy; using the last 20% effectively is the competitive advantage
  • And always remember to validate your model.
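
To make "include the margin of error" concrete, here is a minimal sketch (my own example, not from the talk) of the standard 95% margin of error for a poll proportion:

    import math

    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        """95% margin of error for a sample proportion p from n respondents."""
        return z * math.sqrt(p * (1 - p) / n)

    # A hypothetical candidate polling at 52% among 900 respondents:
    p, n = 0.52, 900
    print(f"{p:.0%} +/- {margin_of_error(p, n):.1%}")  # 52% +/- 3.3%

A 52% poll result with a 3.3-point margin of error is a probabilistic statement, not a guaranteed win – exactly Silver's point.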

Next up was Serving Up Self-Service BI with Carlie Idoine. I attended an earlier discussion on Self-Service BI, but that was more of an overview. This session was more of a how-to, and it actually related to a blog post I wrote a few months ago that can be found here.

  • Self Service BI should be complementary to existing BI, and not a replacement.
  • Self Service BI consists of end users designing and deploying their own reports with an approved and supported architecture and tools portfolio.
  • Self Service BI is NOT:
    • A fix for all BI woes
    • One-size-fits-all
    • A replacement for existing BI
    • A shiny new tool
  • Steps to implementation:
    • Identify new sources of data
    • Identify new types of data
    • Build a logical data model
    • Establish a common metadata model (a sketch of what one entry might look like follows this list)
    • Create a data management and data quality standard
    • Identify the data stores:
      • RDBMS
      • OLAP
      • Data Discovery (in-memory, columnar, etc.)
      • Flexible provisioning (expand and contract the data, new technology evaluations)
    • Delivery and presentation:
      • Leverage existing reports
      • Extend the tool functionality (mobile, data discovery, SaaS, etc.)
    • Maintain a library of standard components:
      • Templates, which help with validation of reports
      • Monitored and controlled artifacts
    • Share your results and apps
  • Recommendations:
    • Know your users
    • Assess the current state
    • Consider the toolset
    • Build a strong information management infrastructure
    • Define appropriate and inappropriate uses
    • Leverage Self-Service models and analysis to establish an analytic culture
    • Collaboration
    • Be realistic
    • Evaluate the environment
    • Educate
    • Develop and plan your approach
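
Idoine didn't show code, but to make the common metadata model step concrete, here is a minimal sketch of what one registry entry might look like. All names and fields here are my own hypothetical illustration:

    from dataclasses import dataclass, field

    @dataclass
    class DatasetEntry:
        """One entry in a hypothetical shared metadata registry for self-service BI."""
        name: str
        source: str                 # approved store: RDBMS, OLAP cube, in-memory, etc.
        owner: str                  # accountable data steward
        refresh: str                # load cadence end users can rely on
        quality_checks: list = field(default_factory=list)
        approved_uses: list = field(default_factory=list)   # "define appropriate uses"

    registry = {
        "sales_daily": DatasetEntry(
            name="sales_daily",
            source="warehouse.sales.fact_orders",   # hypothetical table name
            owner="finance-data-stewards",
            refresh="daily, 02:00 UTC",
            quality_checks=["no null order_id", "revenue >= 0"],
            approved_uses=["revenue dashboards", "regional trend analysis"],
        ),
    }

    print(registry["sales_daily"].owner)   # self-service reports resolve data here

The point of a registry like this is that self-service reports pull from one governed definition of "daily sales" instead of ad-hoc extracts.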

Last was a review of Five Best Practices and Five Common Mistakes in Real-Time Analytics with Roy Schulte. His key points were:

Best Practices:

  • Tailor information to the role, individual and decision
  • Use a layered approach to information presentation (time and attention are scarce resources)
  • Enable action, not just insight
  • Automate the decision and response where possible (see the sketch after this list)
  • Expand from single stovepipe KPIs to panoramic visibility
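
"Enable action" and "automate the response" lend themselves to a small illustration. This sketch is my own, with hypothetical rules and names: routine conditions trigger an automated response, while drastic steps are escalated to a human – the guard rail Schulte warns about below.

    def respond_to_queue_depth(depth: int, capacity: int) -> str:
        """Hypothetical rule: automate routine responses, escalate the rest."""
        utilization = depth / capacity
        if utilization < 0.7:
            return "ok: no action needed"
        if utilization < 0.9:
            # Well-understood, low-risk response: safe to automate.
            return "action: auto-assign two more agents"
        # Guard rail: a human confirms anything drastic.
        return "escalate: page the on-call supervisor"

    for depth in (50, 80, 97):
        print(depth, "->", respond_to_queue_depth(depth, capacity=100))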

Common Mistakes:

  • Reporting too many things (extraneous metrics)
  • Cluttered visual displays
  • Deploying systems without guard rails (systems that can make disastrous recommendations)
  • Relying on fixed thresholds for comparisons and results (ex: comparing call volume on regular days vs. holidays – see the sketch after this list)
  • Overwhelming users with alerts (alert fatigue)
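
The fixed-threshold mistake is worth a sketch of its own. Rather than one static cutoff, compare today's value against a baseline of comparable days. The day buckets and numbers below are my own hypothetical illustration:

    import statistics

    # Call volume history bucketed by day type. A single fixed threshold would
    # either miss a holiday spike or false-alarm on every regular Monday.
    history = {
        "regular": [980, 1010, 1040, 995, 1020],
        "holiday": [420, 450, 430, 460],
    }

    def is_anomalous(value: float, day_type: str, k: float = 3.0) -> bool:
        """Flag values more than k standard deviations from the same-day-type mean."""
        baseline = history[day_type]
        return abs(value - statistics.mean(baseline)) > k * statistics.stdev(baseline)

    print(is_anomalous(1100, "regular"))   # True: high even for a regular day
    print(is_anomalous(440, "holiday"))    # False: normal for a holiday
    print(is_anomalous(1000, "holiday"))   # True: holidays should be quiet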

Stay tuned for a wrap-up, coming soon, with my thoughts on the common themes that ran through the conference!