Wednesday, December 22, 2010

Chapter 5. DWH Design - Dimensional Modeling (FK-MODI)


In this chapter we will see how to create the document called "Modelado Dimensional" (Dimensional Model) within stage 5 (DWH design and architecture). This document serves to represent user requirements from the point of view of a dimensional model, and it in turn serves as input for developing the logical model of the database (DM or DWH).
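Although the FK-MODI template itself is not reproduced here, a small sketch may help show where such a document leads: the dimensional model is eventually translated into a star schema in the database. The following Python snippet, using the standard sqlite3 module, creates a hypothetical fact table and two dimension tables; all table and column names are illustrative, not taken from FK-MODI.

```python
import sqlite3

# Minimal star-schema sketch: one fact table plus two dimensions.
# Names are illustrative; the real ones come from the project's
# "Modelado Dimensional" (FK-MODI) document.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20101222
    full_date TEXT NOT NULL,
    month     INTEGER NOT NULL,
    year      INTEGER NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,  -- surrogate key
    sku         TEXT NOT NULL,
    category    TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    units_sold  INTEGER NOT NULL,
    revenue     REAL NOT NULL
);
""")
```

The fact table carries the measures; each dimension table carries the descriptive attributes that users filter and group by.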


http://blip.tv/file/get/Falconeris-Capitulo5DiseoDWHElModeladoDimensionalFKMODI590.mp4

Tuesday, December 21, 2010

Five Simple Steps to Better Decisions


Business processes seem to come in two flavors: those that produce transactions or content and those that produce decisions. The quality of decisions from the latter category often drives the trajectory of the business. Well-executed, insightful decisions can lead to superior results.




1. Focus on Processes that Matter to Your Business

Organizations improving insightful decision-making carefully pick the key processes and operational variables on which to focus. Clear alignment exists between a successful organization's market strategy and its processes and operating metrics to implement the strategy. W. Chan Kim and Renee Mauborgne, in their groundbreaking book, "Blue Ocean Strategy," developed an interesting approach in which they recommend picking operational variables in the context of strategy development. They point to well-known examples of companies with clearly differentiated strategies, including Southwest Airlines and Cirque du Soleil.

In my own work, particularly in the high tech electronics industry, I have seen clients select variables that include forecast accuracy, order fulfillment rate and inventory levels. Making planning decisions on a weekly basis at the SKU level based on insights across those variables resulted in tremendous improvements in all three.



2. Stay Focused on Your End Goal

The improvements you target should be expressed as changes in the specific selected variables. For example, if the goal is to reduce inventory by 30 percent, the initiative should fit that objective clearly. This kind of Deming approach of "you get what you measure" is well-documented, but it is surprising how many organizations do the first step without then taking the time to set clear objectives in the second. Deriving insights from a business process requires a good balance of freedom to efficiently explore information and decision alternatives coupled with a clear idea of the objective.



3. Ensure Your Data Supports Your Insights

Taking into account the processes, variables and objectives selected in the first two steps, the third step in improving decisions is to determine the readiness of your data and infrastructure to support the kind of insights required. Organizations often get caught in the trap of believing that their data or infrastructure are not up to the task and assuming that progress cannot be made without solving those issues. And yet, decisions must still be made, and it falls on business analysts to cobble together information manually and come to meetings armed with spreadsheets. These discussions based on suspect data often lead to finger pointing and fact questioning instead of insightful decisions. My observation is this: if the data is suitable to drive these required, but often ineffective, discussions, would it not make more sense to leverage it to derive insights more systematically, in a way that improves over time?



One technique for doing this is to provide analytic reports showing all variations of a particular data field along with their owners. The process designers indicate which field variant is authoritative for a particular value, and technology can be used to manage the communication with other owners as they align their data. As alignment is achieved, the quality of the insights steadily improves. This "peer pressure" approach to data cleansing at the source is reminiscent of the rating systems used by sites like eBay. There is an incentive to get the information right at its sources because everyone sees the impacts of good and bad data downstream.
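As a rough illustration of such a variant report, the sketch below groups every spelling of a field by value and lists the owners responsible for each variant. It assumes pandas and uses hypothetical column names and records.

```python
import pandas as pd

# Hypothetical customer records drawn from three source systems.
records = pd.DataFrame({
    "source_owner":  ["CRM team", "Billing team", "Support team"],
    "customer_id":   [1001, 1001, 1001],
    "customer_name": ["Acme Corp.", "ACME Corporation", "Acme Corp."],
})

# Variant report: every distinct spelling of the field, the owners
# responsible for each variant, and how often it occurs.
variants = (records
            .groupby("customer_name")
            .agg(owners=("source_owner", lambda s: sorted(set(s))),
                 count=("source_owner", "size"))
            .sort_values("count", ascending=False))
print(variants)
```

The process designers would then mark one variant as authoritative, and the report tracks the remaining owners as they align their data.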

This technique is distinct from the traditional approach of creating large data warehouses that attempt to consolidate schemas and provide highly cleansed enterprise data from a central source for driving analyses and processes. Many organizations have struggled with the data warehouse approach, in part because their businesses don't remain static long enough to even finish the warehousing project. In my own work, I typically leverage warehouses that are in place, at whatever level of completion, and then use the peer pressure approach to fill in the gaps and address new gaps as they emerge.



4. Parlay Processes & Insights into Smarter Decisions

The fourth step is to design and engineer the process and business analytic capabilities required to produce the insights and execute the resulting decisions. This work might seem straightforward, but it is fraught with subtleties and traps typically resulting from experience biases among the team involved in the work. For example, an IT team charged with deploying a company's business intelligence technology of choice would naturally focus on the reports required. The reports are a critical part of deliverables, but if the business analysts still have to manually transform the information and engage in offline or disconnected interpretation discussions spanning a company's functions, driving insightful decisions remains difficult.

Alternatively, if the team is adept at business software development that supports transaction or content production processes, the tendency is to try to develop analytical processes using the same methodology. This typically results in elongated development cycles and solutions that still miss the mark. Improving insight requires a careful combination of flexibility and context management in some kind of guided analytics environment, as opposed to an exact, step-by-step approach.

If the team comes from a process development or consulting background, the two traps I see most are biasing the work more toward the process than the result and producing one-time deliverables that may not transition well into an ongoing change vehicle.

While many of the skills of these teams are often highly valuable, agile development processes coupled with the right amount of business-focused domain expertise are more suitable for business analytics. Getting capabilities into the hands of the process stakeholders quickly and then letting them evolve as the methods of gaining insights emerge usually adds more value, more quickly, than locking down exact requirements and following traditional development methods. It is equally important to ensure that the resulting process captures the entire insight loop, including planning, reporting, analysis, collaboration, decision-making and execution.




5. Use Your New Processes to Drive Improvements

The fifth step is to operate the new process and drive the targeted improvements. Here, it is important to make sure resources are provided for properly interacting with the process, data and stakeholders to facilitate the emergence of insights and decisions. Initially, the new process might require more work than the old process, especially until the stakeholders get comfortable with the differences in the new decisions versus what they would have done in the past. This initial increase in work should be planned for, and if your program is successful you should soon see a much sharper net decrease in work versus the old process. The best insight-driven processes eventually require tremendous effort to stop, as opposed to tremendous efforts to keep them running.

This five-step approach blends a number of disciplines, and most organizations will not have all of the skills or required technologies readily available. You don't have to go it alone or wait to get started until your team is fully in place, though. Business models and other resources are emerging quickly to help organizations holistically with these kinds of programs and will be of tremendous value as you develop your program. As you move through the steps, you can define the gap between the resources you have and those that are needed and build your case around the target variable improvements that process insights will incrementally deliver to your business.

Friday, December 17, 2010

SaaS BI Tools: Better Decision Making for the Rest of Us


Simple business decisions, each of which impacts a company's performance and efficiency, are made every day, at every level of an organization, by workers in every department. But conventional business intelligence (BI) tools are often not available to most decision makers and are typically designed for use only by trained business analysts. Software as a service (SaaS)–based BI tools are designed to help the millions of people in non-IT "lines of business" (LOBs) who struggle every day with the task of mining Microsoft Excel spreadsheets and other unstructured data sources when performing everyday tasks such as making sales forecasts, planning for resource utilization, or servicing customer accounts. Especially in this time of limited budgets and uncertain futures, inexpensive, easy-to-deploy SaaS BI can help companies put easy-to-use data mining and reporting tools for smart decision making into the hands of more employees and uncover the real "geniuses" of decision making hidden in every department.


Benefits of SaaS BI

BI offerings delivered via the cloud provide tremendous additional benefits of scale and efficiency, lower cost, and better consumption of cloud and local data sources, and they are changing the way businesses license, deploy, and utilize BI to support decisions at their companies. Some benefits of SaaS BI are as follows:


Access by more employees to more data. Key beneficiaries of the trend toward SaaS BI have been the millions of people in non-IT lines of business who struggle every day with the task of mining Excel spreadsheets and other unstructured data sources when performing everyday tasks such as making sales forecasts, planning for resource utilization, or servicing customer accounts. Users of LOB applications produce the production data that drives BI requirements, and the powerful BI reporting and analysis capabilities are especially impactful in the hands of the users who created the data, resulting in greater adoption and utilization. Every business can be more efficient by putting better reporting and analysis tools into the hands of the LOB and departmental employees who are the subject matter experts in their domains. SaaS BI can make their jobs easier by providing browser-based access to sophisticated but easy-to-use data mining and reporting tools and uncovering the "geniuses" of decision making hidden in every department.

Business optimization for hard times. SaaS-based analytics can help companies be more resourceful in volatile times by helping them identify cost savings, efficiencies, and opportunities for process improvement they may have otherwise "missed in the data."


Faster "time to value" for a quicker return on investment. Implementations of SaaS BI solutions can be far faster and less expensive than implementations of conventional solutions. Consider that building a traditional BI solution with a data warehouse implementation, data normalization, and data marts for data staging by query systems typically requires between 6 and 18 months, sometimes longer. By contrast, SaaS BI deployments typically require 2 to 4 months, and SaaS vendors cannot book revenue until the implementation is complete, a situation in which both buyer and seller are equally incented to decrease what some vendors call "time to value."

Streamlined architecture, with zero infrastructure. Unlike on-premises BI systems, SaaS-based BI is hosted by a vendor. Users access the various modules (for example, analysis, reporting) securely via any Web browser. From a systems architecture standpoint, this method is optimal because it does not impose an ongoing computing burden on back-office production systems, and because the application is hosted by the SaaS provider, users do not need to maintain an onsite data warehouse. Users conduct their secure sessions via a Web browser, so there is no client software to install, and users are always assured of running the most recent, optimized version of the application code because SaaS applications are not "rolled out" like conventional applications; they are simply upgraded and optimized on an ongoing basis.



Ability to tap operating expense (opex) budgets versus capital expense (capex) budgets. Because SaaS solutions are licensed as subscriptions, their license cost is a monthly, predictable expense and does not require a one-time up-front payment for licenses as conventional software does. Further, the ongoing support costs to run associated hardware, management, and integration tools and middleware and to hire and train staff members to support on-premises applications are substantial, and nonmaintenance support costs are typically booked as capex. Because capex budgets will be flat in 2010–2011, SaaS solutions give users a chance to get access to BI and analytics tools much faster, using opex funds that might reside in their LOB budgets.

Better alignment of business goals. Business units consuming IT resources sometimes feel discordance between the technology they know they need to have to produce good business outcomes and the tools their IT staff has the skills and bandwidth to deploy. But IT is typically a cost center, and its priorities don't always align with LOB requirements. SaaS-delivered BI helps business units get business done and helps align the goals of the business unit with its technology tools.

Wednesday, December 8, 2010

Worst Practice #1: Assuming the Average Business User Has the Know-How or Time to Use BI Tools


Too Much for Too Few
BI report design, ad hoc query, and OLAP analysis tools have hundreds, if not thousands, of features. Although the user interface is often simple, complexity is introduced from the data side. Even a simple data warehouse has hundreds of columns of data, and it’s not uncommon for more complex systems to have thousands of columns. When an end user is faced with a blank canvas, thousands of columns of data, and hundreds of accessible features, complexity is automatic. “Where do I begin?” is often the first question, shortly followed by “I don't have time for this,” or “I give up.”

The user skill pyramid is a widely discussed and generally agreed-upon description of the end users in most organizations. A simple version of the pyramid demonstrates that 90 percent of the users within most organizations fit into the class of users known as non-technical business users, which means that only 10 percent of users are advanced enough to use a BI tool.


What may not be obvious from the pyramid is that most executives and managers, often the primary strategic decision-makers, are in the lower portion of the pyramid – that is, they are non-technical users.


It’s a Matter of Time
In some instances executives and managers are technical enough to use a BI tool, but they don't have the time to work with one and navigate a data warehouse to produce the information they need. Most people need a faster, easier way to get information than a BI tool provides.


BI Go-To Guys and Multiple Versions of the Truth
In some cases, moderately successful deployments of BI tools are found in individual departments. Usually that means that each department has identified and relies on a handful of advanced users who become the tool experts, or the "BI go-to guys." These users employ the BI tool on behalf of others, and create and distribute information for their department. In these cases, another issue is brought to the surface – the inconsistency of the answers generated by more than one advanced user, also known as multiple versions of the truth.

Multiple versions of the truth result when two or more people apply different query methods and functions, and arrive at different conclusions. The challenge is that it’s difficult to know which, if any, conclusion is correct.

The tool-based efforts of advanced BI users do not go through the same rigorous quality testing as an IT department's. Their work within a tool is typically not auditable. When this occurs, the validity of the information system, the BI tool, and the data warehouse are all brought into question. Valid or not, many companies have more confidence in operational reports generated and tested by IT professionals. Many become skeptical of pure ad hoc information created with a BI tool because of the potential for variations and inconsistencies.

The Solution
Organizations need BI solutions that are easy to use for the entire user population, especially those in the bottom portion of the usability pyramid. In addition, they need a solution that mitigates multiple versions of the truth by providing access to a common source of enterprise information and standardized report generation methods. A BI platform is the answer to all of these requirements.

A BI platform leverages BI tools along with other technologies, including databases, data integration, and portals to provide an end-to-end solution for a defined business problem or set of business problems that can be termed a BI application. While BI platforms are implemented by IT professionals, their end result, the BI application, is designed for business users.

Organizations have been led to believe that BI platforms are too complex for their needs. This couldn't be further from the truth. When you consider the data integration, warehousing, and end-user training costs associated with BI tools, a BI application built on a BI platform has about the same time to market as a BI tool. And end users embrace easy-to-use BI applications as part of their day-to-day routine, which is arguably the most critical success factor of any application. This is why BI platforms have far greater success than BI tools.

The fact is that most non-technical business users can and will access information through BI applications, which are much simpler to use than BI tools. BI applications leverage reporting technology, Web browsers, and e-mail to make information more accessible to these business users in a comfortable, easy-to-use environment.

For example, today’s parameter-driven BI applications provide users a simple Web interface to navigate to the report they want, much the same way they would find an item on eBay or a book on Amazon. BI applications allow users to easily customize the report by selecting options from pull-down menus the same way they would fill in their address and select their home state or a shipping option from a drop-down list.
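A minimal sketch of that parameter-driven idea, assuming the report data sits in a pandas DataFrame with hypothetical columns: the query logic is fixed by IT, and the user only supplies values that a drop-down would offer.

```python
import pandas as pd

# Hypothetical sales data; column names and values are illustrative.
sales = pd.DataFrame({
    "region":  ["East", "West", "East", "South"],
    "month":   ["2010-10", "2010-10", "2010-11", "2010-11"],
    "revenue": [12500.0, 9800.0, 14100.0, 7600.0],
})

def run_report(region: str, month: str) -> pd.DataFrame:
    """Return the canned report filtered by the user's menu choices.
    The query logic stays fixed, which keeps one version of the truth."""
    subset = sales[(sales["region"] == region) & (sales["month"] == month)]
    return subset.groupby("region", as_index=False)["revenue"].sum()

# The values a drop-down menu would offer come from the data itself:
print(sorted(sales["region"].unique()))
print(run_report("East", "2010-11"))
```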

Monday, December 6, 2010

User Trends (Business Analysts)

On the other side of the equation, power users require MAD capabilities 20 to 40% of the time. The bulk of their time is spent using tools designed to handle a variety of analytical tasks, including report authoring tools, spreadsheet-based modeling tools, sophisticated OLAP and visual design tools, and predictive modeling and data mining tools.

Times have never been better for power users. Their desktop computers contain more processing power and can hold more data than ever before. Today, there are more tools designed to help power users exploit these computing resources to analyze information. Many cost less than $1,000 for a single user or can be downloaded from the Internet. "Power users have more power today than ever to perform deep analytics."

Despite the plentiful options, many power users are bereft of optimal analytical tools. Either they restrict themselves to spreadsheets and desktop databases, or that’s all their organization will give them. Most homeowners wouldn’t hire a carpenter with just one or two tools in his toolbox; they want a carpenter whose toolbox contains tools for every type of carpentry task imaginable. In the same way, organizations need to empower power users with a multitude of tools and technologies to make them more productive as analysts. If implemented correctly, the technology can liberate analysts to gather, analyze, and present data quickly and efficiently without undermining enterprise IT standards governing data, semantics, and tools.

Four types. Power users are a diverse group who perform a variety of analytical tasks. I’ve divided power users into four types:

1. Business analysts. Data- and process-savvy business users who use data to identify trends, solve problems, and devise plans.

2. Super users. Technically savvy departmental business users who create ad hoc reports on behalf of their colleagues.

3. Analytical modelers. Business analysts who create statistical and data mining models that quantify relationships and can be used to predict future behavior or conditions.

4. IT report developers. IT developers, analysts, or administrators who create complex reports and train and support super users.

According to our survey, most organizations have all four types of power users, although only 51% have analytical modelers.


BUSINESS ANALYSTS. Business analysts sit at the intersection of data, process, and strategy, and they play a significant role in helping the business solve problems, devise plans, and exploit opportunities. Their titles include “business analyst,” “financial analyst,” “marketing specialist,” and “operations research analyst.” Executives view them as critical advisors who keep them grounded in reality (data) and help them bolster arguments for courses of action.

Business analysts perform three major tasks (a minimal sketch of all three follows the list):
1. Gather data. Analysts explore the characteristics of various data sets, extract desired data, and transform the extracted data into a standard format for analysis.

2. Analyze data. Analysts examine data sets in an iterative fashion—essentially "playing with the data"—to identify trends or root causes. Analysts will visualize, aggregate, filter, sort, rank, calculate, drill, pivot, model, and add or delete columns, among other things.

3. Present data. Analysts deliver the results of their analysis to others in a standard format, such as a report, presentation, spreadsheet, PDF document, or dashboard.
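Here is a minimal pandas sketch of all three tasks; the source extracts, column names (order_date, region, revenue), and output file are hypothetical.

```python
import pandas as pd

# 1. Gather: pull extracts from each source (shown inline here; in
#    practice they would come from read_csv or read_sql) and put them
#    into one standard shape.
east = pd.DataFrame({"order_date": ["2010-10-03", "2010-11-12"],
                     "region": ["East", "East"],
                     "revenue": [1200.0, 950.0]})
west = pd.DataFrame({"order_date": ["2010-10-21"],
                     "region": ["West"],
                     "revenue": [800.0]})
orders = pd.concat([east, west], ignore_index=True)
orders["order_date"] = pd.to_datetime(orders["order_date"])

# 2. Analyze: iterate on the data -- aggregate, pivot, compare.
monthly = (orders
           .assign(month=orders["order_date"].dt.to_period("M"))
           .pivot_table(index="month", columns="region",
                        values="revenue", aggfunc="sum"))

# 3. Present: deliver the result in a standard format.
print(monthly)
monthly.to_csv("monthly_revenue_by_region.csv")
```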


Today, business analysts spend an inordinate amount of time on steps 1 and 3 and not enough time on step 2, which is what they were hired to do. Unfortunately, due to the sorry state of data in most organizations, they have become human data warehouses. TDWI estimates that business analysts spend an average of two days every week gathering and formatting data instead of analyzing it, costing organizations an average of $780,000 a year.

According to one survey, most business analysts use spreadsheets to access, analyze, and present data, followed by BI reporting and analysis tools. However, in most cases, the analysts use BI tools as glorified extract tools to grab data warehouse data and dump it into a spreadsheet or desktop database, where they normalize the data and then analyze it. The next most popular tool is SQL, which analysts use to access operational and other sources so they can dump the data into spreadsheets or desktop databases (which rank number five on the list, following OLAP tools).


To improve the productivity and effectiveness of business analysts, organizations should continue to expand the breadth and depth of their data warehouses, which will reduce the number of data sources that analysts need to access directly. They should also equip analysts with better analytical tools that operate the way analysts do, offering speed-of-thought analysis (i.e., subsecond responses to all actions) and better visualizations to spot outliers and trends more quickly.

Monday, October 25, 2010

Traditional BI Systems Are Not Designed for Agility


The traditional approach to business intelligence (BI) has reached its limits. Over the past 20 years we have developed a set of procedures and technologies that allow us to aggregate, cleanse, prepare, and report on enterprise data. While there have been incremental improvements in efficiency and speed over that time, the fundamental approach to BI has not changed much. During that same period, however, businesses and their information needs have changed dramatically. Enterprises have transformed themselves from rather isolated entities, often with an inward focus on operational efficiency, to players in geographically dispersed, multi-partied ecosystems that need to have as much understanding of their customers' interests and partners' operations as they do of their own operational efficiency. The information they collect, transport, and analyze has changed accordingly and, more importantly, so has the speed with which they need to act.

Today agility is the goal of most organizations regardless of their industry and it is a top goal of business leaders whether they are in sales, marketing, manufacturing, engineering, customer care, or service delivery. Businesses are competing not just on efficiency, but on their ability to sense market conditions and quickly respond. That is exactly where BI and the business needs have diverged.

In many organizations, BI systems are used almost exclusively to generate standard monthly or quarterly reports. These reports deliver great value – certainly most organizations could not survive without them. But traditional BI systems require an expensive and time-consuming process to identify all the answers the business users ultimately want, build a data model that captures that information and unify data from disparate sources into that data model. Often, multiple cycles of this process are required before the needs of the business users are fully addressed. More than anything, a traditional BI approach is designed to structure data, ensure consistency across systems, and efficiently handle large amounts of structured data – goals that seemingly run counter to the need for agility.

Thursday, October 7, 2010

Q&A on SaaS BI Technology


SaaS is a service model that sits on top of cloud computing environments and is designed to leverage the power of the cloud infrastructure. The software applications are accessed via a client such as a Web browser. They are managed and maintained by the SaaS vendor, removing much of the administration cost generally associated with on-premises software solutions.

SaaS and cloud computing are dependent on one another to bring their unique value to the market.


What types of projects are most often addressed with SaaS BI? What BI projects aren't good candidates for this technology?

The types of projects best suited for SaaS business intelligence are evolving quickly. Several years ago, the most basic functions of BI were the only ones best suited for the SaaS environment. Recently, SaaS vendors have taken great strides to deliver sophisticated feature sets and are now branching off into corporate performance management suites and even on-demand predictive analytics. Reporting and analysis still lead the way as the most heavily adopted feature sets within SaaS solutions, but as the technology continues to evolve, so will the demands of the SaaS customers.

Because of the obvious data integration challenges presented by SaaS BI applications, real-time data is still difficult to leverage in a SaaS model. Leading integration vendors are starting to deliver new solutions that greatly reduce data access time. As these innovations continue, real-time business intelligence will become easier.


One of the benefits SaaS promises is that it reduces an enterprise's IT costs. Is SaaS really less costly than traditional solutions?

In many cases, SaaS BI is less expensive than traditional on-premises software. Recent research by Enterprise Management Associates (EMA) shows that 76 percent of the organizations implementing SaaS have realized significant ROI. Avoiding the upfront capital costs of hardware, along with the lower operating costs of SaaS, has contributed to these savings. There are exceptions, especially when the number of user licenses is high. Many companies have moved to a total cost of ownership (TCO) model when analyzing the impact of SaaS versus on-premises solutions. Many variables are involved in these scenarios and determine which direction is financially best for a company.


What other benefits can BI professionals expect?

There are many benefits to SaaS BI from a monetary standpoint. Both the capital and operating expenses of IT are reduced for most SaaS implementations. This upfront cost savings can translate into reduced risk for companies that are trying to experiment or deliver proof-of-concept (POC) projects. The elastic qualities of SaaS are a perfect fit for companies that experience seasonal highs and lows, as they don’t need to purchase additional hardware to handle the changes. Some studies have shown that SaaS BI solutions have fostered greater adoption among business users, adding to a more diverse business intelligence community.


You've discussed some compelling benefits. Are there any drawbacks to SaaS? Given that data is now "off site," isn't security a problem?

Security has always been a hurdle for SaaS. Customers are vigilant about securing their data as well as governance, regulatory, and compliance issues that surround it. The vendor community recognized this challenge early on and has addressed it on two fronts. The first is third-party auditing and certification. Leading SaaS BI vendors are SAS-70 certified and some have also attained Systrust certification. Both certifications relate to how the company controls client data and the systems and processes surrounding their data infrastructure.

The second front is innovation around data integration. Many of the vendors are finding ways to keep the data where it is while still leveraging the power of SaaS and cloud computing platforms. Data virtualization firms have also entered the market, providing trusted data federated environments that reduce the security risks of off-premise systems. In the end, the most reliable security feature for SaaS vendors is their own desire to prosper. EMA research has shown that 83 percent of the respondents would be unlikely to work with a firm that has had a security event. SaaS vendors know this and take every precaution to secure customer data and the applications they run on.


Is SaaS a real alternative for enterprise-sized companies? If so, what are the challenges to success?

Small to mid-size companies make up the largest portion of SaaS BI customers. Enterprise-sized companies have been slower to adopt SaaS BI. Large companies are often not early adopters and take a wait-and-see perspective. This, coupled with the need for a greater level of sophistication around feature sets, made it difficult for these firms to see SaaS as a viable alternative to on-premises solutions early on. Enterprise-level successes such as SalesForce.com have shown large firms that SaaS can fit their needs and expectations, causing them to look hard at what the SaaS BI market has to offer. There is now a thriving community of firms big and small delivering SaaS BI to enterprise clients such as AON Insurance, ACNielsen, DHL, and Citrix. Both pure-play upstarts and the biggest BI vendors in the world are now offering on-demand SaaS BI products and applications.


What best practices can you offer to overcome these challenges and for successfully managing a SaaS BI project?

Companies looking into SaaS BI solutions need to consider many things, including security, licensing, TCO, feature sets, training, and service management. I highly recommend that before jumping into the deep end of the pool, a customer should scope out a proof-of-concept project that closely aligns with their overall business intelligence needs.

As with any BI project, stakeholders, IT staff, and line-of-business users must take part in the process. It's important that the customer understands both the flexibility and the restrictions of a SaaS application. Customization in a SaaS environment can be costly or impractical. Support and service are also important in a SaaS relationship. Service events should be built into the POC to assure that the needs of the clients are met.

Monday, September 27, 2010

Designing a Metrics Dashboard for the Sales Organization

The primary objective of the dashboard creation process is to identify and implement key performance measures and indicators that will enable managers to quickly and effectively manage the sales organization. This can be accomplished through selecting metrics that support sales objectives, strategy and goals. Some of the benefits that will result from implementing the dashboard include:


• Gain a deeper understanding of the drivers of sales productivity
• Identify where management action is required to improve sales productivity and effectiveness
• Develop a common vehicle for monitoring and improving performance
• Understand sales performance from a variety of perspectives
• Build consensus on key performance measures and drivers
• Clarify accountability around specific measures
• Enable performance benchmarking with competitors and best-in-class companies


Approach
Corporate vision guides the development of an organization’s sales objectives, strategy and tactical goals. Metrics are in turn driven by sales strategy and goals. At the tactical level, metrics serve as the primary vehicle for managing performance within the organization. Targets are set for each metric, performance is monitored and interpreted to provide timely feedback and corrective actions are initiated.

But which metrics should we choose? The sheer abundance of metrics can make it difficult to identify the ones that make the most sense. The first step in answering this question is to create a framework in which all the available metrics may be organized and prioritized. This framework consists of two dimensions: a corporate perspectives dimension and a sales performance dimension. The corporate dimension takes a 360-degree view of the organization from five distinct perspectives: customers, employees, partners, investors and internal processes. This view is typical of the so-called "Balanced Scorecard" approach.

Each of the corporate perspectives should be examined and appropriate individuals identified to provide a list of metrics. In addition to the corporate perspective, a sales performance dimension must also be included. This breaks sales performance into four elements: readiness, productivity, efficiency and effectiveness.



The key to the metrics identification process is twofold: fact-finding to identify candidate metrics, and categorizing those metrics according to the two dimensions above, corporate perspective and sales performance. In practice, this means creating a matrix with these two axes and populating it with the metrics collected through the fact-finding process.
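As a rough sketch, the matrix can be represented as a pandas DataFrame whose cells hold lists of metrics. The axes below come from the article; the example metrics placed in the cells are illustrative.

```python
import pandas as pd

# Corporate perspectives across the columns, sales performance
# dimensions down the rows.
perspectives = ["Customers", "Employees", "Partners",
                "Investors", "Internal processes"]
dimensions = ["Readiness", "Productivity", "Efficiency", "Effectiveness"]

# Start with an empty matrix; each cell holds a list of metrics.
matrix = pd.DataFrame([[[] for _ in perspectives] for _ in dimensions],
                      index=dimensions, columns=perspectives)

# Populate cells as fact-finding uncovers metrics (illustrative):
matrix.loc["Effectiveness", "Customers"].append("Order fulfillment rate")
matrix.loc["Productivity", "Internal processes"].append("Forecast accuracy")

# Empty cells highlight areas with no metric coverage.
coverage_gaps = matrix.applymap(len) == 0
print(coverage_gaps)
```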


Dashboard Design Process
The dashboard design process consists of metric selection, design and implementation. Each of these steps involves some basic principles outlined below.

Metric Selection

• Supports stated objectives, strategies and goals
• Can be directly impacted by sales management
• Can be measured in a cost effective and timely fashion
• Reflects one of the four key dimensions of sales performance (readiness, productivity, efficiency and effectiveness)
• Enables performance benchmarking with industry competitors and best-in-class companies


Dashboard Design Principles

• Reflects senior management priorities
• Balances internal and external metrics
• Includes measures of past performance and indicators of future performance
• Minimizes the number of metrics in order to facilitate management interpretation

The actual design process is outlined below along with the detailed steps involved.

1. Metric Selection

• Identify existing and potential metrics by corporate performance perspective (interview process)
• Categorize metrics into four dimensions of sales performance (efficiency, effectiveness, productivity and readiness) and eliminate unclassifiable metrics
• Create preliminary scorecard matrix that combines business perspectives with sales performance dimensions
• Review scorecard matrix for completeness and add metrics based on experience


2. Dashboard Design

• Eliminate metrics that cannot be measured or are too costly to measure
• Eliminate metrics that cannot be significantly impacted by sales management
• Prioritize metrics based on alignment with stated strategy and goals
• Select top metric per cell in scorecard matrix based on alternative approaches
• Evaluate alternative scorecards and select most appropriate metrics


3. Implementation

• Assign metric accountability
• Determine performance targets
• Obtain available benchmark data
• Determine monitoring, interpretation and feedback procedures and guidelines
• Develop corrective action review process


Metrics Matrix Design
To facilitate the dashboard design process, a matrix tool may be created to help classify the various metrics uncovered in the fact-finding process. Because each metric can be understood in terms of sales performance as well as a business perspective, a metrics matrix can be created that combines the business perspectives along the horizontal axis with sales performance dimensions along the vertical axis. Each metric is placed in the matrix based on its most appropriate classification with respect to these dimensions. This tool has the following benefits:

• Creates a framework around the metrics selection process
• Balances business perspectives and sales performance views
• Provides a systematic approach
• Facilitates prioritization
• Allows identification of particular areas of emphasis
• Highlights areas with no metric coverage


Criteria for Eliminating Metrics
Eliminate metrics that cannot be measured or would be too costly to measure
• Partner coverage
• Amount of effort exerted on business approvals

Eliminate metrics that cannot be directly impacted by the sales organization
• Customer’s growth rates
• Customer profitability
• Partner satisfaction
• Number of deals per partner
• Share of partner revenue by platform
• Partner’s profit margin
• Partner churn
• Rate of technology transfer
• Number of certified consultants
• Number of certified partners

Prioritization Decision Rules
Each cell in the metrics matrix may contain many metrics and, as a result, must be prioritized. Some basic rules to follow in that process are as follows:

• Alignment with stated strategy and goals – Use metrics that align with the organization's stated strategy and goals
• Frequency and intensity of emphasis during fact-finding – Use metrics that multiple corporate perspectives emphasized
• Experience – Use metrics that experience shows are important to measure
• Availability of benchmark data – Use metrics for which benchmarks exist


Preliminary Dashboard
After the matrix is complete, a preliminary dashboard may be created that graphically represents the top metrics from each cell. Feedback from management can help determine additional changes or alternative metrics that are required.

Implementation Steps
After agreement on dashboard design, the implementation process may begin. Effective dashboards require live data feeds and, hence, the data integration process may be complex because of multiple data sources. Here is a list of the steps involved in implementation.

• Select final dashboard metrics
• Identify data sources
• Assess feasibility
• Assign metric accountability
• Develop action plan
• Create timeline
• Populate initial metrics
• Establish internal and external benchmarks
• Determine targets
• Determine monitoring, interpretation, feedback procedures and guidelines
• Develop corrective action review process

Best practice allows for online dashboards that may be customized to a user's needs. For example, the matrix tool described above might be provided online, and users could select the metrics they are interested in and build up their own dashboards. In addition, each user will want the ability to drill down to the level of the organization that is relevant to their position (e.g., a district manager wants to see his district's data).
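A minimal sketch of that drill-down scoping, assuming a hypothetical mapping from each user to the organizational level they are allowed to see (in practice this would come from the security layer):

```python
import pandas as pd

# Hypothetical dashboard data; names and figures are illustrative.
metrics = pd.DataFrame({
    "region":   ["East", "East", "West"],
    "district": ["NE-1", "NE-2", "W-4"],
    "pipeline": [410000.0, 265000.0, 380000.0],
})

# Each user is scoped to one level of the organization.
USER_SCOPE = {
    "district_mgr_ne1": ("district", "NE-1"),
    "region_vp_east":   ("region", "East"),
}

def dashboard_view(user: str) -> pd.DataFrame:
    """Return only the rows relevant to the user's position."""
    level, value = USER_SCOPE[user]
    return metrics[metrics[level] == value]

print(dashboard_view("district_mgr_ne1"))  # one district
print(dashboard_view("region_vp_east"))    # both NE districts
```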

In conclusion, the dashboard design process is detailed and requires thorough research. In addition, data integration and online application development are critical. However, the benefits of an effective dashboard far outweigh the costs in allowing management the critical measures necessary to guide the organization toward success.

Falconeris Marimon Caneda
Socio Director
TTS Consulting

Friday, September 17, 2010

The Agile Data Warehouse: Keeping Users Happy


Though they share a single word, agile data warehousing (DW) is nothing like agile software development.

Agile programming disciplines tend to champion a code-first, document-later ethic. Some agile approaches even eschew traditional documentation altogether. Agile programming techniques tend to place an emphasis on frequent testing: at least one agile discipline, test-driven development (TDD), explicitly prescribes a test-first approach.

In all of their variants, agile approaches emphasize the importance of frequent (and typically interactive) involvement with line-of-business customers. It isn't unusual for agile teams to solicit feedback from customers on a periodic (daily, weekly, or bi-weekly) basis. This lets them incorporate new features as customers demand them -- or change features based on feedback from users.

There are a number of reasons why a straight-up agile approach doesn't translate very well into the data warehousing world, experts say.

There's the important paradigmatic distinction between programming -- with its procedural (or line-by-line) orientation -- and data management (DM), which typically lives and thinks in a set-based world.

There are practical logistical concerns, too. "You have to look at it kind of differently, because it can take you longer to write a test case than it takes us to generate the code for you. Suddenly, you're in a different paradigm.

When you're building warehouses in an agile fashion, you're bringing together the concepts of software development and data, and a lot of the agile software techniques don't flow across to the data world."

A lot of the agile buzz at last month's TDWI World Conference in San Diego concerned agile business intelligence (BI), which, Whitehead respectfully suggests, isn't at all the same thing as agile data warehousing.

"When people talk agile in the data world, they generally talk agile BI. They generally talk about the reports, that sort of layer becoming agile. That's a no-brainer. If it's a distinct point where you have customer interaction, of course you should put something in front of them. It isn't quite so easy with a data warehouse," he argues.

All the same, Whitehead describes himself as a proponent of agile data warehousing, particularly inasmuch as "agility" connotes the acceleration or automation of tedious, onerous, time-consuming, or otherwise costly tasks.

Agility is, of course, synonymous with nimbleness, deftness -- that is, speed.

Finally, the essence of agile: "If you're a data guy, you need to make sure that you are doing whatever you can to deliver quickly, deliver value, and make changes so that your stuff is relevant. If you can't do that, people are going to fill that vacuum."

Friday, August 27, 2010

To Stage or Not to Stage in the Data Warehouse


The back room area of the data warehouse has frequently been called the staging area. Staging in this context means writing to disk and, at a minimum, I recommend staging data at the four major checkpoints of the ETL data flow. But the main question in this note is: when do I need to design a staging area in my data warehouse project?


To Stage or Not to Stage

The decision to store data in a physical staging area versus processing it in memory is ultimately the choice of the ETL architect. The ability to develop efficient ETL processes is partly dependent on being able to determine the right balance between physical input and output (I/O) and in-memory processing.

The challenge of achieving this delicate balance between writing data to staging tables and keeping it in memory during the ETL process is a task that must be reckoned with in order to create optimal processes. The decision of whether to stage your data comes down to two conflicting objectives:


- Getting the data from the originating source to the ultimate target as fast as possible

- Having the ability to recover from failure without restarting from the beginning of the process


The decision to stage data varies depending on your environment and business requirements. If you plan to do all of your ETL data processing in memory, keep in mind that every data warehouse, regardless of its architecture or environment, includes a staging area in some form or another. Consider the following reasons for staging data before it is loaded into the data warehouse:

- Recoverability. In most enterprise environments, it's a good practice to stage the data as soon as it has been extracted from the source system and then again immediately after each of the major transformation steps, assuming that for a particular table the transformation steps are significant. These staging tables (in a database or file system) serve as recovery points. By implementing these tables, the process won't have to intrude on the source system again if the transformations fail. Also, the process won't have to transform the data again if the load process fails. When staging data purely for recovery purposes, the data should be stored in a sequential file on the file system rather than in a database (see the sketch after this list). Staging for recoverability is especially important when extracting from operational systems that overwrite their own data.

- Backup. Quite often, massive volume prevents the data warehouse from being reliably backed up at the database level. We've witnessed catastrophes that might have been avoided if only the load files were saved, compressed, and archived. If your staging tables are on the file system, they can easily be compressed into a very small footprint and saved on your network. Then if you ever need to reload the data warehouse, you can simply uncompress the load files and reload them.

- Auditing. Many times the data lineage between the source and target is lost in the ETL code. When it comes time to audit the ETL process, having staged data makes auditing between different portions of the ETL processes much more straightforward because auditors (or programmers) can simply compare the original input file with the logical transformation rules against the output file. This staged data is especially useful when the source system overwrites its history. When questions about the integrity of the information in the data warehouse surface days or even weeks after an event has occurred, revealing the staged extract data from the period of time in question can restore the trustworthiness of the data warehouse.
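Here is a minimal sketch of checkpoint staging to compressed sequential files, using only the Python standard library; the checkpoint names, directory layout, and sample rows are illustrative.

```python
import csv
import gzip
from pathlib import Path

STAGE_DIR = Path("staging")  # illustrative location
STAGE_DIR.mkdir(exist_ok=True)

def stage(rows, checkpoint: str) -> Path:
    """Write rows to a compressed sequential file named after the
    checkpoint, so a failed step can restart from the last stage
    instead of intruding on the source system again."""
    path = STAGE_DIR / f"{checkpoint}.csv.gz"
    with gzip.open(path, "wt", newline="") as f:
        csv.writer(f).writerows(rows)
    return path

def load_stage(checkpoint: str) -> list:
    path = STAGE_DIR / f"{checkpoint}.csv.gz"
    with gzip.open(path, "rt", newline="") as f:
        return list(csv.reader(f))

# Hypothetical flow: stage immediately after extraction, then again
# after each major transformation step.
extracted = [["1001", "Acme Corp.", "12500.00"]]
stage(extracted, "01_extracted")
transformed = [[cid, name.upper(), amt] for cid, name, amt in extracted]
stage(transformed, "02_transformed")
# If the load fails, restart from load_stage("02_transformed")
# rather than re-extracting or re-transforming.
```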


Once you’ve decided to stage at least some of the data, you must settle on the appropriate architecture of your staging area. As is the case with any other database, if the data-staging area is not planned carefully, it will fail. Designing the data-staging area properly is more important than designing the usual applications because of the sheer volume the data-staging area accumulates (sometimes larger than the data warehouse itself).

Thursday, August 19, 2010

What's Essential -- And What's Not -- In Big Data Analytics (Columnar Database?)


Far from arguing over the benefits (or drawbacks) of a column-based architecture, shops would be better advised to focus on other, potentially more important issues. Row- or column-based engines marketed by Aster Data, Dataupia, Greenplum Software Inc. (now an EMC Corp. property), Hewlett-Packard Co. (HP), InfoBright, Kognitio, Netezza, ParAccel, Sybase Inc. (now an SAP AG property), Teradata, Vertica, and other vendors (to say nothing of the specialty warehouse configurations marketed by IBM, Microsoft, and Oracle) are by definition architected for Big Analytics.

Analytic database vendors today compete on the basis of several options -- capabilities such as in-database analytics, support for non-traditional (typically non-SQL) query types, sophisticated workload management, and connectivity flexibility.

Every vendor has an option-laden sales pitch, of course -- but few (if any) stories are exactly the same. In-database analytics is particularly hot, according to Eckerson. All analytic database vendors say they support it (to a degree), but some -- such as Aster Data, Greenplum, and (more recently) Netezza, Teradata, and Vertica -- seem to support it "more" flexibly than others.

"With in-database analytics, scoring can execute automatically as new records enter the database rather than in a clumsy two-step process that involves exporting new records to another server and importing and inserting the scores into the appropriate records," he explains.

The twist comes by virtue of (growing) support for non-SQL analytic queries, chiefly in the form of the (increasingly ubiquitous) MapReduce algorithm. Aster Data and Greenplum have supported in-database MapReduce for two years; more recently, both Netezza and Teradata, along with IBM, have announced MapReduce moves. Last month, open source software (OSS) data integration (DI) player Talend announced support for Hadoop (an OSS implementation of MapReduce) in its enterprise DI product. Talend's MapReduce implementation can theoretically support in-database crunching in conjunction with Hadoop-compliant databases.

"[T]echniques like MapReduce make it possible for business analysts, rather than IT professionals, to custom-code database functions that run in a parallel environment," he writes. As implemented by Aster Data and Greenplum, for example, in-database MapReduce permits analysts or developers to write reusable functions in many languages (including the Big Five of Python, Java, C, C++, and Perl) and invoke them by means of SQL calls.

Such flexibility is a harbinger of things to come, according to Eckerson. "[A]s analytical tasks increase in complexity, developers will need to apply the appropriate tool for each task," he notes. "No longer will SQL be the only hammer in a developer's arsenal. With embedded functions, new analytical databases will accelerate the development and deployment of complex analytics against big data."

Wednesday, August 4, 2010

An Agile BI Program

One of the biggest misconceptions about agile is that it is about getting more done faster. This is simply false. It is about delivering the right things of value, with a high degree of quality and in small iterations. The word "more" should be dropped. It is about avoiding waste, or "muda," when creating value. Have you ever delivered something that took a long time to build, only to have it never be used? Ask yourself why that was the case. This is the waste that we seek to avoid.

One of the biggest difficulties is to get the heads of business users and technical teams wrapped around thinking iteratively. Remember that delivering in small bits with communication built into the process is a foreign concept to many. Most people are equipped to deal with big bang and are unsure how to engage with a process that requires constant communication and participation. Others are simply afraid that once you deliver something you will never be seen again, so they ask for everything at requirements-gathering sessions.

Delivering small has other benefits. We can avoid bottlenecks in the process by completing smaller chunks of work and by keeping all points in a process continuously busy as opposed to having too many wait states. This makes it easier to test and demonstrate. The biggest benefit is that it gets "something" of value into production quicker, versus keeping valuable assets on the shelf in development. If value can be derived, get it into production as soon as your cycles allow. I purposefully refer to the outputs of BI development as assets, and they should be managed as such.

One of the biggest benefits is that when you fail, you fail fast. This is a good thing in that you demonstrate progress periodically and can ask your business users: "Is this what you wanted?" If it is not, you have wasted less development time than you would have under other methodologies (such as waterfall). However, there are perception issues with this, as failing is generally considered "bad." This is true only if you never learn from failures. If you incorporate continuous improvement ceremonies (such as regular start, stop, and continue sessions), this misperception can be mitigated.

Agile is well suited to data warehouse development because requirements are often difficult to gather for BI applications. This is the nature of BI, coupled with the fact that BI teams are often not properly staffed with dedicated business analysts. Such inherent challenges make adhering to the agile process beneficial. Prototyping, demonstrating, and communicating all help shape requirements over time by showing working models that can be used to elicit feedback.

A word of caution: No process will fully make up for poor requirements gathering.

Architect big and deliver small must be an overarching principle. It is one thing to deliver in small iterations, but you should have some idea of what your end state should look like at the program level. This is the "architect big" part of the principle. The "deliver small" part comes from the agile cycles, and data models are part of those cycles.

Be Prepared for a Journey

Agile is a process and, like any process, it can have resistors. It takes time to hit your stride, so be patient. Agile takes time to implement. Having someone on your team who has been part of a successful agile development process will certainly help. If you do not have any experience, look for a coach who can help guide you through the process, or at least get the team trained.

Either way, it is an amazing journey, and it is rewarding to watch a process mature and improve.

Tuesday, July 20, 2010

Mobile Business Intelligence Reporting

From a sociological perspective, users are becoming more comfortable with their phone's ergonomics and multitude of features, and are using them as full-functioning mobile computers. Phones and laptops are becoming interchangeable. Initial evidence of this convergence is the large volume of e-mails sent from BlackBerrys and other mobile Windows-enabled smartphones, as well as the proliferation of mobile CRM applications. Also, phones have an advantage over laptops because they can be carried anywhere and used anytime – 24 hours a day, seven days a week. They don't require mobile hot spots or other Internet connections, and with Bluetooth they can be easily connected to printers and other peripherals, making almost the entire office portable.






Mobile browsers now provide the same functionality as desktop Web browsers, so users get a consistent experience regardless of device. More people are searching the Web, reading news, watching streamed TV, accessing Web applications, and making transactions on their phones. As this trend continues, business is driven to evolve. Google, for example, recognized the increased use of mobile devices as a medium for Web browsing and made its search tool and productivity applications (Google Apps) available on mobile phones, setting the benchmark for usability.

Smartphones are also forcing a shift in the paradigm of how information technology (IT) groups work. There are currently 1.5 billion phones in use around the world. By 2011 half of the world’s population will have mobile phones – 50 percent of which will be smartphones. This change clearly indicates that enterprises have to embrace smartphones as a primary form of communication. IT groups – for the first time in their history – have to adapt to consumer requirements instead of dictating their own agenda. If consumers can now access their Gmail on phones, why not access corporate apps too?


Improvements in Productivity

Economic gains from enabling mobile reporting are irrefutable. Currently one out of seven e-mail users is also a mobile e-mail user, having a BlackBerry or another smartphone. Early adopters, mainly executives, have seen measurable increases in productivity by being able to:

- Work during times otherwise wasted, such as while waiting at airports and before meetings
- Respond immediately to urgent messages
- Be available to and connected with other key decision-makers 24/7


Gains in productivity outweigh the expense of mobile devices and applications, an estimated fixed cost of $2,500 per mobile user. A low-cost mobile BI solution that does not require additional infrastructure investments thus drives up the per-user return on investment (ROI). Furthermore, as mobile computing spreads through the ranks to all employees, the ROI increases further.
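
To make the arithmetic concrete, here is a minimal sketch of the per-user calculation, in Python. Only the $2,500 fixed cost per mobile user comes from the text above; the hours recovered, hourly rate, and working weeks are hypothetical inputs you would replace with your own figures.

    # Back-of-the-envelope ROI for mobile BI enablement.
    # Only the $2,500 fixed cost per user comes from the article;
    # all other inputs are hypothetical placeholders.
    def mobile_bi_roi(users, fixed_cost_per_user=2500,
                      hours_recovered_per_week=2, hourly_rate=60,
                      weeks_per_year=48):
        annual_gain = users * hours_recovered_per_week * hourly_rate * weeks_per_year
        annual_cost = users * fixed_cost_per_user
        return annual_gain, annual_cost, annual_gain / annual_cost

    gain, cost, roi = mobile_bi_roi(users=100)
    print(f"gain=${gain:,} cost=${cost:,} ROI={roi:.1f}x")
    # gain=$576,000 cost=$250,000 ROI=2.3x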

According to Gartner analyst Steve Kleynhans, "Most IT organizations are ill prepared to deal with this new environment in which users drive technology." IT groups are often (and in many cases justifiably) leery of new technologies. Knowing the difficulties inherent in implementing unproven solutions, many would prefer to wait for other companies to provide successful case studies with clear user benefits. Yet, waiting until this technology becomes mainstream means missing out on years of productivity gains.


Dashboards for Everyone

The sheer volume of information available, however, means users risk information overload. Dashboards have emerged as a concise way to visualize information. Instead of analyzing multiple reports and the relationships between them, a dashboard offers an analytical perspective. All relationships and associated measures are presented in a single, prepackaged view. The key obstacle to mass use of mobile dashboards is the small screen on the device as well as the requirement to be connected to the dashboard infrastructure. Two trends are changing this:

Better, larger screens with higher resolution are becoming popular, as on the iPhone, HP hybrid devices, and Nokia business phones. And, better browsers with advanced zoom functions, touch screen navigation, and interaction enhancers – such as zoom drop boxes for easier selection – display content in a useful way similar to dashboard displays.

Active Dashboards can be distributed to anyone, on any device, via e-mail, via the My Mobile Favorites launch page, or by posting them on the Web, and users can interact with them online or offline.

lunes, 28 de junio de 2010

Agile BI (Business Intelligence) Basics

What is Agile BI?

Cindi Howson: The Agile Manifesto was first published in 2001 by a group of software engineers (see agilemanifesto.org) trying to improve the software development process and customer satisfaction. There are 12 principles, but the six that most apply to BI are:

• Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
• Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.
• Business people and developers must work together daily throughout the project.
• The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
• Simplicity -- the art of maximizing the amount of work not done -- is essential.
• The best architectures, requirements, and designs emerge from self-organizing teams.


Who is using agile development and how important is it?

I do get the sense that more innovative companies are using agile development, but I have also seen it in established manufacturing companies. It is less well suited to companies that have outsourced BI because it makes it harder to build things to a specification. Then again, I’m not a supporter of outsourcing for BI.

Agile BI emerged as a common theme among successful BI case studies when I began researching my book Successful Business Intelligence in 2007, so last year, we included this data point in the survey. Overall, agile development was identified as being not that important.

Agile sounds like the Wild West of BI with no requirements, no documentation.

If you are used to having everything highly documented, with requirements precisely defined, then less formal requirements definition can seem like the Wild West. The difference is in the how and in the degree. Requirements are still gathered, but perhaps collaboratively, through rapid prototyping, rather than by the business writing out specifications before they can look at any results.

martes, 1 de junio de 2010

Business Intelligence First Steps

Many small and mid-sized enterprises (SMEs) are looking for the best business intelligence (BI) solution to address their specific business problems. Whether these business pains are putting out regular fires, managing a sales force, increasing customer satisfaction, or gaining more visibility into the business and data, business intelligence is becoming the buzzword used to identify the solution used to address these problems.

Unfortunately, business intelligence on its own is not the answer to solving an organization’s business problems. The ability to effectively solve issues and develop a successful BI infrastructure depends upon the combination of the people involved and the business processes put in place. Although there are no surefire ways to ensure project success, there are things that SMEs can do and take into account when looking at starting their BI initiatives.

This article explores the first steps that SMEs should take in order to work toward BI success. The key factors that organizations must consider when looking to use BI to solve business problems and gain visibility into their business are:


1. Defining the right scope
2. Identifying, using, and managing the right data
3. Engaging the right people
4. Integrating proper project planning and management practices

Defining BI Project Success

As mentioned, the four areas listed above do not guarantee project success. However, careful consideration of these items gives companies a way to start any BI project on the right foot and put the processes in place that are required to grow and maintain a strong BI infrastructure and front-end analytics and reporting solution. Because there is so much to consider when looking at any hardware- and software-related project that deeply affects how people do business on a daily basis, taking a step back and identifying individual aspects helps simplify initiatives that require the collection of many complicated and diverse business and technical requirements.

1. Defining the Right Scope – Answering the Right Questions
The first step in any BI project is to identify the business problem. In some cases, organizations want business intelligence to solve all of their problems at once. Obviously, one of BI’s advantages is the ability to consolidate large amounts of disparate data to help companies gain a broader view of what is happening within the company. However, when looking at solving business issues and aligning strategic goals with business performance, doing it well outweighs doing it fast. Therefore, companies should identify their main business pain and start building their solution around that issue to identify general goals and metrics associated with performance management. By developing a targeted scope that addresses key business issues and starting small, organizations can work toward building a solution that meets the needs of many departments within the organization based on incremental success.

2. Identifying, Using and Managing the Right Data – Turning Data into Information
Once the scope is defined, businesses can look at what information is required. This means looking at where data resides, who accesses that data (both operationally and analytically), how often it is updated, how often it is required for reporting and analytics, the types of business rules that exist, what hardware and software it runs on, and what gaps currently exist in relation to analytics or general visibility. Although it is important to start small, organizations should still identify all of the information required for the data warehouse, because identifying all required data sources up front lessens the time spent on integration activities over time.
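
A simple way to keep this inventory honest is to record the same attributes for every source. The Python sketch below captures the checklist from the paragraph above as a small data structure; all field names and sample values are hypothetical.

    # A minimal data-source inventory covering the attributes listed above.
    # Field names and sample values are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class DataSource:
        name: str
        location: str             # where the data resides
        access: str               # "operational", "analytical", or "both"
        update_frequency: str     # how often the source data changes
        reporting_frequency: str  # how often reporting/analytics needs it
        platform: str             # hardware/software it runs on
        business_rules: list = field(default_factory=list)
        known_gaps: list = field(default_factory=list)

    inventory = [
        DataSource(name="sales_orders", location="ERP (orders schema)",
                   access="both", update_frequency="hourly",
                   reporting_frequency="daily", platform="SQL Server",
                   business_rules=["exclude cancelled orders"],
                   known_gaps=["no returns data before 2008"]),
    ]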

The type of data and systems currently in use will affect the overall solution choice. Depending upon integration requirements, some solutions integrate specific types of data or information from source systems more easily, while others offer robust linking, matching and data profiling that can help with complicated data reconciliation efforts or merging various business practices into a single data warehouse. Although not always seen as important to business users, the ability to maintain data integrity on a continual basis will help ensure accurate data visibility and better decision making over time.

3. Engaging the Right People – Enter the Stakeholders
Without proper input from the people who own the data and interact with the data, there is the potential to miss key requirements when looking at developing a BI solution. Every person interacts with information differently depending upon his or her role within the company. Consequently, the requirements gathered can make the difference between project success and a solution that no one uses. To make sure that general buy-in occurs, it is important to include the relevant stakeholders in the process. Stakeholder involvement will be different in each company as many different business functions may interact with financial or sales data, or have input related to employee performance.

4. Integrating Proper Project Planning and Management Practices – Back to the Basics with Project Management
Even though not all companies use formal project management tools to manage software selection initiatives, managing projects requires some sort of formalized approach. Tracking stages and managing dependencies throughout the project life cycle helps identify whether everything is on track, how delays will affect future activities, and whether the project will be completed within the proposed time frame and budget. A project's success should not be measured solely by whether it finishes on time and within budget, but delivering a first BI implementation within its defined parameters helps ensure support for future expansions. Within a BI environment, there are constant projects to enhance and expand solution use because of the benefits companies see as they begin to interact with their reporting and analytics environment.

Building BI Step by Step
These four aspects provide guidelines for organizations at the beginning of a BI project and can help lead to a greater chance of project success. Overall, organizations should realize that implementing BI requires business, technical, people, and process considerations and that any gap in one of those areas will create a hole in the overall project. Even if the first implementation breeds success, the continual use and expansion of business intelligence depends upon the cohesion of these four areas.

The ability to define and limit an initial project scope, include stakeholders in the requirements-gathering phase, and manage the project using a defined framework all fall into the areas of business, people, and processes. BI infrastructure, along with identifying data and how it will interrelate, usually provides the bulk of the preparation for a first BI initiative. And even though technology requirements are very important within any BI project (especially when looking at data warehousing for the first time), it is essential not to overlook the business, people, and process areas, as they become a greater influence as business intelligence use expands within the organization.

miércoles, 26 de mayo de 2010

Open Source BI Solutions: a Low TCO Prospect

Business intelligence is a vital component for successful business management. It introduces capabilities for effective decision-making, resulting in higher income and increased growth for the organization. BI programs must keep strategic goals and organizational missions in mind, while reducing the cost and time of implementing solutions.

Open source BI may be evaluated against the parameters of total cost of ownership, performance, scalability and user requirements.

Why is Business Intelligence important to an Organization?

Organizations can make intelligent decisions when timely information is consistently made visible to decision-makers at all levels, as this endows them with the ability to monitor important drivers of organizational performance. A well-designed BI system collects the organization’s operational data from different sources, presenting it to decision makers and stakeholders simply and meaningfully through use of a user-friendly tool. A good BI solution helps organizations gain better insight into their businesses, improve decision-making and optimize enterprise performance.


An Open Source BI Overview

Open source BI has come a long way relative to commercial BI products and is becoming widely recognized as an important component for enterprise-level applications. Open source BI projects such as Pentaho and Jasper have evolved from community-driven tools into viable technology with professional support for enterprise-wide adoption, and demand for them is growing. Organizations can use open source BI software to replace custom-coded applications. Open source BI tools can also be considered for BI components that complement an existing proprietary solution to reduce license costs. Because organizations are not locked into a proprietary vendor's platform, open source increases organizational flexibility.

TCO: Critical Factor in Implementing BI Solutions

While few will deny the importance of BI, the most important factor to consider is the total cost of owning the application. The TCO concept measures costs related to the acquisition of a BI solution, its deployment, and its ongoing use. Though TCO estimation methods vary across BI implementations based on requirements and business needs, certain proportions may be assumed in most projects: typically, staffing accounts for 50 percent of the cost, while hardware accounts for 8 percent of the project value. Based on market trend reports published by leading industry analyst organizations, the cost breakdown in Figure 1 may be assumed as the TCO breakdown for most BI implementations.

Open source helps reduce TCO across all the parameters in Figure 1, lowering both costs and risks for prospective BI users. This does not mean that open source BI is the right choice for every organization in every BI deployment, but it can be an alternative for reducing BI costs if it satisfies user requirements.

[Figure 1: TCO cost breakdown assumed for most BI implementations]

Major TCO Components

Hardware: This covers the cost incurred in procuring hardware throughout the organization, including all client machines, servers, storage solutions, and networking devices attached to servers. Because most software licenses are priced by the number of CPUs, hardware choices directly impact licensing costs as well. Using a scale-out approach, low-cost servers can be used to deliver open source BI solutions.

Software: The cost of software is one of the most significant factors in the overall TCO. Open source BI is available at a fraction of the cost of commercial products, and open source BI customers have the flexibility to choose components and support levels according to the requirements of their end users.

Staffing: Staffing constitutes 40 to 50 percent of a BI application's cost, including the cost of resources during the analysis, development, and maintenance phases. The vendor's ability to provide documentation and access to expert resources has a big impact on the TCO. Open source is based on public standards and public-domain technologies, which widens the pool of available skills.
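
Taken together, these proportions make the TCO arithmetic easy to sketch, as in the Python fragment below. The staffing and hardware shares come from the figures quoted above; the software and "other" shares are hypothetical fillers so the parts sum to 100 percent.

    # Rough TCO decomposition for a BI project.
    # Staffing (50%) and hardware (8%) shares come from the article;
    # the software and "other" shares are hypothetical.
    def tco_breakdown(total_project_value):
        shares = {"staffing": 0.50, "hardware": 0.08,
                  "software": 0.22, "other": 0.20}
        assert abs(sum(shares.values()) - 1.0) < 1e-9
        return {item: total_project_value * share
                for item, share in shares.items()}

    print(tco_breakdown(1_000_000))
    # {'staffing': 500000.0, 'hardware': 80000.0,
    #  'software': 220000.0, 'other': 200000.0}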


Selection Criteria for Open Source BI Solution
Though TCO is the commonly accepted financial measure for evaluating a BI solution, factors such as user requirements, complexity of development, and scalability of the solution have to be analyzed to perform the TCO calculation. The five factors that affect a BI solution are:


• BI product selection and user requirements,
• Complexity of development,
• BI project timelines,
• Product support and third-party support, and
• Performance and scalability.

These points can be used to compare open source BI solutions with proprietary vendors' solutions.

BI product selection and user requirements. The objective of collecting BI user requirements is to establish the expected outcome of the BI solution and other aspects of the project relating to time, cost, and resources. With open source BI solutions, organizations can verify requirements without contacting the product company, because they can initiate a proof of concept and refine the requirements without buying the BI tools.

Complexity of development. Developing a BI solution depends not only on the user requirements but also on the product's features and technology. Because open source BI products are based on technologies available in the public domain, resources for developing and maintaining applications are easily available. Most open source BI solutions allow a design approach in which a prototype can be built rapidly, with regular testing and feedback from the BI users.

BI project timelines. Any BI solution requires orchestrated effort by the team to complete the solution on time, and the choice between proprietary and open source technology affects that human cost. When considering an open source tool, organizations must allow time for developers and supporting staff, such as database administrators and testers, to understand and learn the technology. That said, open source BI products have simplified their tools and added features that can reduce development timelines.

Product support and third-party support. Open source companies typically provide product support at very low subscription prices compared to proprietary vendors. A systems integration partner is usually brought in to support the solution.

Performance and scalability. The performance of a BI solution depends on factors such as data source performance, server hardware, content complexity, and user requests. Most open source BI solutions support scale-up and scale-out architectures and can scale linearly.

Integration with existing infrastructure. Open source BI solutions provide integration interfaces through which customizations can connect to the existing infrastructure, and the solution can be embedded in compliant servers. Information such as cubes and reports can be integrated through XML, HTML, or JSR-168 portlets, and open source solutions are compatible with multiple operating systems.

End users and supporting personnel training. End users are business people who understand business terms. Open source solutions can provide a semantic layer that hides the complexity of the data and allows end users to explore information using business metadata, removing the need for end users to learn a coding language or product-specific syntax. System integrators or open source BI companies can train the support team when the BI solution moves into production.

miércoles, 28 de abril de 2010

Do business organisations need a single process management infrastructure?

Last week, a friend sought my advice on whether her company should implement a single process management infrastructure to automate and manage their enterprise-wide process management needs. The insurance company she works for is evaluating a BPM system/application to automate its travel reimbursement process. While doing so, the company is also exploring the possibility of utilising the same process management infrastructure to automate processes such as the New Business process, Policy Servicing process, Claims Management process, New Product Development process, etc.

Now, the processes described above are different in nature and have different traits. I remember reading an interesting process classification theory put forward by two wise men (unfortunately I do not remember their names) many years ago. They classified organisational business processes based on Business Value (revenue increase, cost reduction, productivity/efficiency enhancement, etc.) and their Repeatability, i.e. their ability to repeat themselves for every instance of the process that occurs.

[Diagram: organisational processes classified into four quadrants by business value and repeatability]

As is shown in the diagram above, organisational processes can be classified into four areas:

• Production processes - with high business value and a high degree of repeatability; e.g. the New Business, Policy Servicing, and Claims Management processes

• Collaborative processes - with high business value but a low degree of repeatability; e.g. New Product Development, Contract Formulation

• Admin processes - with low business value but a high degree of repeatability; e.g. the Travel Reimbursement, Leave Approval, and Conference Booking processes

• Miscellaneous / Ad-hoc processes - with low business value and a low degree of repeatability
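
Since the classification is a simple two-by-two grid, it can be written down directly. The Python sketch below maps the two ratings onto the four classes named above; it is only an illustration of the scheme, not part of any BPM product.

    # The two-by-two process classification described above:
    # (business value, repeatability) -> process class.
    def classify_process(business_value, repeatability):
        """Both arguments are 'high' or 'low'."""
        quadrants = {
            ("high", "high"): "Production process",
            ("high", "low"):  "Collaborative process",
            ("low", "high"):  "Admin process",
            ("low", "low"):   "Miscellaneous / ad-hoc process",
        }
        return quadrants[(business_value, repeatability)]

    print(classify_process("low", "high"))   # Admin, e.g. travel reimbursement
    print(classify_process("high", "high"))  # Production, e.g. claims management
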
In my opinion, the same process management infrastructure may not be utilised to manage all the types of processes described above. There are two issues:

1. Is the BPM system capable of managing both repeatable and non-repeatable processes?


2. Is it financially feasible for the organisation to manage high-value and low-value processes using the same BPM system?

Fortunately, BPM systems have evolved over time, and some of the leading BPM systems now possess dynamic process management capability, which allows business users to alter the flow of a process even at run time, i.e. as the business process gets executed. Such BPM systems would address issue #1.

However, these BPM systems tend to be expensive and require high-end IT infrastructure. In such cases, software, hardware, and implementation services costs tend to be prohibitively high to justify using the same process management infrastructure for low-value admin processes alongside high-value production and collaborative processes.

So, in my opinion, organisations may have to settle for more than one process management infrastructure to manage all their enterprise-wide processes. What do you think?

viernes, 12 de marzo de 2010

From BPM to Management by process

In one of my first courses on Business Process Management, the chairman, an expert in Total Quality Management, drew a major distinction between Process Management and Management by Process.

To me it was just a question of words. Some years later, reviewing the lessons of my first BPM initiative, I really understood the difference when I discovered that the remedy had been worse than the disease. Let me explain why.

After this course, the Quality Director and I, as Information System Director, decided to instruct the managers of our company in Business Process Management.

A map of the processes was defined, and the process owners trained (more or less one in each functional department). After a first set of process attributes was established (mission, inputs, outputs and key performance indicators), objectives were defined and tracked through dashboards.

It seemed everything was perfect in the best of all worlds:

- The Quality Director could periodically offer top management a measurement of the company's performance, not only in terms of finances, but also in customer satisfaction and internal process improvement

- The IS Manager had official spokesmen from each Department to address and implement improvement initiatives through technological solutions.

So why, some years after this initiative, had the organization become more divided, with more difficulty delivering on time, on quality, and on cost? Wasn't management of the processes supposed to provide more effectiveness and efficiency to the business?

After analyzing the situation, it appeared that only one dimension of the problem had been addressed: the vertical one. The functional organization had been reinforced.

In fact, nobody was addressing the horizontal axis, nor looking for integration and coherence of the whole system. The processes were managed, yes; but the company was not managed by process.

• What is Process Management?

– Focus is put on effectiveness (benefits optimization)
– Improvement initiatives are local or by job category (vertical)
– Most IS solutions are custom-built to perfectly match the functional needs
– Power is in the hands of the Process Owners, who define "best of breed" solutions

I call this the vertical axis of business process improvement, as it tends to match the hierarchical organization.


• What is Management by Process?

– Focus is put on efficiency (ROI driven)
– Priority is put on results at company level
– Ad-hoc organization with leadership at top level
– The improvement axis is more horizontal, e.g. the supply chain
– Processes are integrated with strategy (Balanced Scorecard)
– Off-the-shelf solutions are chosen (for lower total cost of ownership)
– An enterprise-wide view is required to communicate (enterprise architecture)

As the initiatives are considered from an integrated point of view, I call this the global or horizontal axis.

If you have to assume some BPM responsibility, be sure you are balancing these two axes.

Most business process initiatives start from a department that wants to solve a concrete problem first. That is a good starting point.

However, as a coordinator of the whole improvement process, you face a major risk: making your organization more vertical, with barriers between departments that make horizontal operations more difficult.

To mitigate this risk, I see three major actions you should lead at company level, if not implemented yet:
- "Plant your Balanced Scorecard tree" to link Business Process improvement with Strategy
- Define the value chain your organization brings to its stakeholders (customers...)
- Be the Enterprise Architect (also called city planner) of your Business

viernes, 5 de febrero de 2010

Dashboard is to envelope as scorecard is to letter

Dashboards and scorecards are the Holy Grail of business intelligence. With either interface, users can easily and quickly find, analyze, and explore the information they need to perform their jobs. To borrow a term from the telecommunications industry, dashboards and scorecards represent the last mile of wiring connecting users to the data warehousing and analytical infrastructure organizations have created during the past decade.

Industry perceptions

But which is right for you? Although many people use dashboard and scorecard synonymously, there is a subtle distinction between them. Dashboards monitor and measure processes. The common industry perception is that a dashboard is more real-time in nature, like an automobile dashboard that lets drivers check their current speed, fuel level, and engine temperature. So, a dashboard is linked to systems that capture events as they happen, and warns users through alerts or exception notifications when performance against established metrics deviates from the norm.

Scorecards chart progress toward objectives. The common perception of a scorecard is that it displays periodic snapshots of performance associated with an organization's strategic objectives and plans. It measures business activity at a summary level against predefined targets to see if performance is within acceptable ranges. It displays key performance indicators that help executives communicate strategies and help users focus on the highest-priority tasks needed to execute plans.

So, while a dashboard informs users what they are doing, a scorecard tells them how well they are doing. Or, put another way, a dashboard is a performance monitoring system; a scorecard is a performance management system.
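
The distinction can be made concrete with a toy example. In the Python sketch below, the dashboard-style check fires as soon as a reading leaves its allowed range (monitoring), while the scorecard-style check compares a periodic snapshot against a target (management). All metric names, thresholds, and figures are hypothetical.

    # Dashboard = monitoring: alert the moment a metric deviates.
    def dashboard_alert(metric, value, low, high):
        if not (low <= value <= high):
            return f"ALERT: {metric}={value} outside [{low}, {high}]"
        return None

    # Scorecard = management: periodic snapshot versus a target.
    def scorecard_status(kpi, actual, target):
        return f"{kpi}: {actual / target:.0%} of target"

    print(dashboard_alert("orders_per_hour", 42, low=50, high=200))
    print(scorecard_status("quarterly_revenue", actual=4.2e6, target=5.0e6))
    # ALERT: orders_per_hour=42 outside [50, 200]
    # quarterly_revenue: 84% of target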


Reality blurs the distinctions

In reality, however, these distinctions often fall apart when we examine how organizations use dashboards and scorecards. Most dashboards provide context to evaluate performance. Even indicators on an automobile dashboard provide more than just raw data: the labels on the gauges show when you're speeding, when you need more fuel, or when the engine is overheating. Newer cars even alert drivers with sounds or lighted icons when something needs immediate attention.

Meanwhile, many scorecards provide users with more than just monthly snapshots of summary performance data. Executives use them to empower users to work more productively. The best scorecards provide actionable information--the right data delivered to the right person at the right time. There's no use charting a department's progress if the data arrives too late or without sufficient detail for users to know how to fix a problem or capitalize on a fleeting opportunity.


Using cascading scorecards

Many people believe the term cascading scorecards refers to a series of hierarchical scorecards that align individuals and groups to an organization's overarching strategy. Integrating scorecards throughout the organizational hierarchy can effectively prod the organization to focus on the real drivers of corporate value and performance. Too often, executives create strategies and send them to managers and staff, who are too preoccupied with more immediate concerns, such as meeting budget goals, to concentrate more deeply on executing strategy. Deploying scorecards throughout the organization also shows employees how their actions affect the organization's direction and performance.

Dashboards and scorecards are not mutually exclusive. In fact, the best dashboards and scorecards merge each other's elements. If dashboards don't measure performance against key business objectives, why is the organization engaged in that business activity? If scorecards don't empower users with actionable information to change performance outcomes, what's the point of keeping score?

I like to view a dashboard as the container for performance information, and the scorecard as the content in that container. Or, a dashboard is like an envelope and the scorecard a letter inside it.