Sunday, December 6, 2009

5 Things You Should Not Confuse Business Performance Management With

Filed Under (BI and Performance Management) by TEC Team, Gabriel Gheorghiu, and Aleksey Osintev


If you search for business performance management (BPM) on Google, you’ll get around 700,000 results. Out of this huge number of results, you will presumably refer to a popular source—Wikipedia. According to Wikipedia, BPM is “a set of processes that help organizations optimize their business performance.” The same source affirms that some people see it as the next generation of business intelligence (BI). Both of these explanations—unfortunately—lack clarity.

Going back to the Google search, there are a few near-synonyms for BPM that one can choose from: business intelligence performance management, performance management scorecard, key performance indicators, and business performance metrics. Similarly, Wikipedia has four synonyms for BPM as well, including corporate performance management (CPM), enterprise performance management (EPM), operational performance management (OPM), and business performance optimization (BPO).

Confused? Is it BI, a set of processes, scorecards, performance indicators, metrics, or are all these equally valid parts of BPM? Since we intend to write a series of articles on BPM, we thought we might start this thread a bit differently and first try to explain what BPM should not be confused with.

1. Business Process Management (BPM)
There is always a kind of confusion when using the same acronym (BPM) for different software packages (i.e., business performance management and business process management). In spite of the undoubted links between these two application types, they differ greatly for the majority of software users and IT professionals. Broadly speaking, a generic business process management system allows analysts and business managers to design and model business processes in a graphical and descriptive view, then execute them, monitor the processes, and finally, modify or optimize them.

There are similarities between business process management systems, enterprise application integration software, and workflow automation solutions. By the way, notice yet another BPM abbreviation here: business process modeling, which is a substantial element of business process management. This is basically a technique (or set of techniques) for capturing, visualizing, and describing business processes that gives companies a clear view of their processes and helps them analyze those processes in order to improve them.

2. Business Intelligence (BI)
Is BI part of BPM? Definitely! Sound business decisions have to be based on accurate information, and the most efficient way to get that information is through a BI tool. Still, BI is not enough. The best BI tool in the world can give you the greatest dashboards, graphs, ad hoc reports, and so on, but they are completely useless unless you have a good idea of what to do with them.

It is safe to say that BI is the framework or the tool that will help you improve your business, but it will not complete this task for you. This is where BPM comes into play. A BPM provider should be able to support you in defining your business processes and objectives, as well as the metrics or key performance indicators (KPIs) you need to follow. Furthermore, your BPM provider will assist you in building the tools you need in order to extract the right data from the right place and then interpret it according to the already defined objectives.

3. Balanced Scorecard, Business Process Measurement, and Key Performance Indicators
When talking about business performance management, we should clearly understand that it is possible to successfully manage “something” as long as that “something” can be measured. In other words, in order to estimate how well your business is doing, some formal methodologies, criteria, and metrics are required.

However, it is not enough to estimate your company's achievements using financial criteria only. There are other important activities that (while difficult to quantify and evaluate) must also be measured and compared in order to form a more complete picture. The balanced scorecard, business process measurement, and KPIs were developed as systematic approaches to help managers at all levels effectively control the company (or departments within it) and react quickly to market and environmental changes and challenges.

These three concepts are closely related to each other, but represent different views of the same process. The balanced scorecard is used mostly by a company's top management to monitor overall business performance against the company's strategic goals. Mid-level and operational management usually use business process measurement parameters to visually examine routine, day-to-day processes against the short-term or current goals of the department or organization. Both of these methods use KPIs as metrics to quantify and analyze criteria that are easily countable and, often, criteria that are not. These indicators are usually presented as sets of diagrams and graphs that change dynamically as the underlying numbers change. Sometimes these sets of diagrams are called dashboards (by analogy with a car or airplane dashboard and its gauges).
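
As a purely illustrative sketch (the metric, targets, and status thresholds below are hypothetical, not taken from any particular methodology), here is how a single KPI might be computed and given a dashboard-style status:

# Hypothetical KPI: on-time delivery rate, with made-up thresholds for the status colors.
def on_time_delivery_kpi(orders_shipped_on_time, total_orders):
    rate = orders_shipped_on_time / total_orders
    if rate >= 0.95:
        status = "green"
    elif rate >= 0.85:
        status = "yellow"
    else:
        status = "red"
    return rate, status

rate, status = on_time_delivery_kpi(orders_shipped_on_time=870, total_orders=940)
print(f"On-time delivery: {rate:.1%} -> {status}")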

4. Total Quality Management (TQM), Lean, and Six Sigma
At first glance, these mechanisms, methodologies, and concepts can be referred to as different types of business process management. They reflect different views of the same core goal of business process improvement, and they address products, processes, customer satisfaction, quality, and the practical techniques to plan, organize, and control that improvement. They all consider business process improvement a global strategic goal which, as a result, leads companies to better financial numbers.

Certainly, they are not the same thing. While there are plenty of books, articles, and Web sites available to help readers understand these concepts, a reader who is not a professional in these disciplines can easily become lost in this ocean of information.

Generally speaking, total quality management, lean, and six sigma are, as methodologies, much wider and deeper in substance than business performance management, which is a useful way to assess the current business and financial situation of an organization and to provide food for thought for managers at all levels, assisting them in optimal decision making.

5. Reporting and Analytics
An in-depth explanation of the differences between BI, reporting, and analytics exceeds the scope of this post. So we'll keep this part short but sweet: analytics is complex reporting, while BI is a sophisticated reporting and analytics tool.

Most accounting, enterprise resource planning (ERP), customer relationship management (CRM), supply chain management (SCM), and product lifecycle management (PLM) solutions offer reports, and most of them even allow you to run analyses on sales, purchases, productivity, and more. As our jobs become ever more information-intensive, reporting, analytics, and BI are essential to today's workforce.

Reporting and analytics tools do not always provide data in a format that can be used by a BPM product. Oftentimes, information comes from a variety of sources and—just to make things worse—different tools are used to extract it. A BPM tool should be able to gather all the required data from all available sources and convert it into a format that can be used in the decision process.
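
As a minimal, purely illustrative sketch (the column names and mapping below are invented), consolidating figures from two differently formatted sources into one common structure might look like this:

import pandas as pd

# Two hypothetical sources with different layouts (stand-ins for an ERP export and a CRM export).
erp = pd.DataFrame({"order_id": [1001, 1002], "cust_no": ["C-7", "C-9"], "net_amount": [1200.0, 540.0]})
crm = pd.DataFrame({"opp_id": ["OPP-55"], "customer": ["C-7"], "expected_value": [3000.0]})

# Map both onto one common format that a downstream BPM/decision layer can consume.
common = pd.concat([
    erp.rename(columns={"order_id": "record_id", "cust_no": "customer_id",
                        "net_amount": "amount"}).assign(source="ERP"),
    crm.rename(columns={"opp_id": "record_id", "customer": "customer_id",
                        "expected_value": "amount"}).assign(source="CRM"),
], ignore_index=True)

print(common.groupby("source")["amount"].sum())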

To Be Continued…
Five years ago, the BPM Standards Group was created by IBM, SAP AG, Hyperion Solutions Corp., IDC, Meta Group, The Data Warehousing Institute, and BPM Partners Inc. One of its goals was to properly define BPM and to create standards for it.

Friday, December 4, 2009

Vendor Rating and Certification Updates: BI and ERP

It’s mid-November and time to tell you about some of the new product ratings and certifications that we’re covering in our research. TEC analysts recently completed certifying products from BatchMaster and Targit.

Each vendor successfully demonstrated how its product addressed a script of functionality as identified by TEC analysts. (Look for products proudly wearing the TEC certification badge in our evaluation centers and vendor showcase.)

* The Targit BI Suite, with its “few click” approach, is covered in our BI Evaluation Center.
* The BatchMaster Enterprise solution focuses on the requirements of companies in the process manufacturing industry, as covered in our ERP Evaluation Center.

In addition to those TEC-certified products, we also published new data about the following products.

* Newly revised data on the OpenAir professional services automation suite.
* Webcom joined our Business Process Management (BPM) Evaluation Center with the submission of its ResponsAbility product.
* Software development and QA company Technosoft joined our Outsourcing Evaluation Center.

Straight Up on Leads Management

No, I’m not about to launch into a Paula Abdul cover (I won’t even dignify that with a link).

Lead generation is a process that uses information to create interest in an enterprise’s products or services. Its end objective is to generate sales.

Several steps are involved in this marketing process. Before a company begins, it needs to define the market that its product or service caters to, segment that market, and then identify its most profitable areas. Once this is done, the leads generation process, which involves prospecting, preapproach, approach, and close, begins. As a prospect moves through the leads cycle, information is created and filtered. Sensibly, a business should use this information to follow up with its customers to see whether they were satisfied with the service or product, and then generate leads metrics that can be used to further refine the leads generation and sales process.

The leads generation process gathers a lot of information and involves a lot of tracking, and it should generate dialogue not only between the company and customers, but within the company between sales and marketing in particular. A leads management solution uses different methodologies and practices to govern this information and distribute it to the appropriate people within an organization.
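
As a purely illustrative sketch (the stage names, scoring rules, and team names are hypothetical, not drawn from any particular leads management product), routing lead information to the appropriate people might look like this:

from dataclasses import dataclass

@dataclass
class Lead:
    name: str
    source: str        # e.g., "webinar", "web form", "trade show"
    budget: float      # estimated deal size
    stage: str         # e.g., "prospect", "preapproach", "approach", "close"

def route(lead: Lead) -> str:
    """Decide which team should own the lead (hypothetical rules)."""
    if lead.stage in ("approach", "close"):
        return "field sales"
    if lead.budget >= 50_000:
        return "inside sales"
    return "marketing nurture"

print(route(Lead(name="Acme Corp", source="web form", budget=75_000, stage="prospect")))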

There are a couple of factors that are spurring the need for effective leads management tools. The biggest factor is that consumers are becoming more savvy, and are not easily compelled by traditional marketing. Companies are seeking to effectively target their core market by catering to their target’s specific needs.

The following white paper by BLUEROADS (original caps) outlines some of the rules that vendors should adhere to when managing leads distribution. Some recommendations include:

* Using clear terminology for each stage of the lead pipeline
* Using partners that are relevant and experienced in a particular area
* Having realistic expectations
* Using lead pull methodology

Given this, enterprises need to find software that is appropriate to their needs. In his excellent blog, Brian Carroll points readers to a Forrester Marketing blog post by Laura Ramos that highlights four key buckets of leads generation technology aimed at improving the efficacy of leads generation. I’ll repeat them here (but I do encourage you to visit both sites):

1) web analytics
2) database services
3) marketing automation
4) pure play leads management

Needless to say, this involves a lot of technology and integration with existing CRM and SFA systems. On its own, a leads management system will not be a panacea for a business’s slumping sales. On this, Carroll reflects:

“Software will not spontaneously generate collaboration between sales and marketing…I regularly encounter organizations that invest in expensive software before they fully understand the fundamental operational processes that it will be supporting.”

In other words, enterprises do not appreciate the type of information they need and who will be using it within the company. (He also writes how his company spent over a million dollars and nearly a decade to almost perfect its current leads management system. Brian, if you’re reading this, I invite you to try TEC’s tool…) A good leads management system is one that is used. There must be management buy-in, and the sales and marketing teams must be diligent in inputting and extracting information. For stakeholders to use the system, it must offer tools that they need. Failing this, money and resources are wasted housing dirty data: data which has no form or function outside of confounding the business.

Enterprises should use a decision support system to help them map out their needs and measure their priorities. The decision process itself can be long and arduous if it is not managed correctly. (It’s detailed here as a part of TEC’s software evaluation and selection methodology.)

For different vendors’ takes on leads management issues, visit our white paper site.

Business Solutions of the Future

The future is tomorrow’s present. Many have tried to predict it using silly or scientific methods, from chiromancy (palm reading), aleuromancy (fortune cookies), and other -mancies, to the three Ps (possible, probable, and preferable futures) and a W (or wildcard—low-probability events with a high impact on the future) used in futurology.

Without trying to create a “CRMorology” or “ERPmancy”, I aim to write a series of articles about the future of business software. Since this concerns everyone—and because I’m not Nostradamus or Hari Seldon (Asimov’s famous psychohistorian)—I would like to involve you, our readers, as well as business professionals and decision makers from the enterprise software industry. From students with little knowledge (but extraordinary imagination), to analysts who know everything about the market and vendors who know for sure what will NOT happen in the near future, I need you to let me know how you see the future of business applications.

How Is It Going to Be?

A future in which business applications will not be needed is too far-off to foresee, so that will not be discussed here. So, if we can’t live without these applications, how will they evolve? Will there be one huge, global business software provider, employing armies of programmers and customer support people? Or maybe myriads of open source products that will work together and be as easy to assemble as the pieces of a puzzle?

I guess we could let our imagination wander indefinitely, but let’s get a bit organized here: what we’ll aim for is seeing what could possibly happen in the next ten, fifty, and one hundred years.

Ten Years in the Future

A decade is not such a long time, so it should be easier to foresee the major trends in the business software industry that may happen during this time span. Still, even in the short term, this is quite a challenge. Look at meteorology: the weather changes so fast and unexpectedly that we can only know for sure what it is after it has happened.

Speaking of meteorology, I see some clouds gathering above the world of enterprise resource planning (ERP). Is there going to be a storm? No, they say cloud computing is the alternative to the traditional storage of information: instead of storing the data on a server in your company, you can put it in data centers anywhere in the world. Some don’t believe it will work, but it was not so long ago that people used to keep their money under the mattress because they did not trust banks.

We don’t trust banks today either, but we do use e-banking and credit cards. The same thing will happen with the clouds: their utility and efficiency will eventually be stronger than the fear of losing data. They already exist and the biggest in the world has 150 locations and will store 150 million gigabytes (GBs) every year, or 100 GBs every four minutes.

We will probably have sufficient space for the data, but what about its security? According to a study from Oracle, twenty percent of IT managers think that data security breaches will happen at their organizations in the next year. And the main threats are not viruses and hackers; they mostly come from inside the company. Do we need a massive, worldwide data loss before we learn from our mistakes (the way we are supposedly learning now, during the economic downturn)?

Let’s say we store and secure the data—how do we access it? It doesn’t look like a problem now, but it will surely become one—maybe sooner than we think. According to an IDC report, we created 281 billion GBs of data in 2007, and by 2011, that number will increase to 1,800 billion GBs.

While we do have more sophisticated tools to extract and manipulate data, one of the challenges of the future will be to have structured data. This involves the existence of workflows for data creation and administration, data cleansing, and data deduplication (removal of duplicate records).
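
For illustration only (the record layout and matching rule below are hypothetical), a naive deduplication pass over customer records might look like this:

# Naive deduplication: treat records with the same normalized name and email as duplicates.
records = [
    {"name": "ACME Corp.", "email": "info@acme.example"},
    {"name": "Acme Corp",  "email": "INFO@ACME.EXAMPLE"},
    {"name": "Globex",     "email": "sales@globex.example"},
]

def key(record):
    name = record["name"].lower().rstrip(".").replace(",", "").strip()
    return (name, record["email"].lower())

deduplicated = list({key(r): r for r in records}.values())
print(f"{len(records)} records in, {len(deduplicated)} records out")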

Business data is created by users through an interface to a database. Despite the fact that all enterprise software vendors claim to offer “intuitive” and “user-friendly” solutions, the complexity of these tools keeps on growing. Since the trend seems to be toward grouping several solutions in the same suite (most of the time, from different providers acquired by the same vendor), integration seems more important than innovation.

Some vendors offer a platform as a service (PaaS) (also known as cloudware), which is aimed at helping customers easily design, develop, and test their own applications. Large companies can benefit from PaaS, as it will allow them to create applications tailored to their complex needs, thus reducing costs. On the other hand, once you choose to use a PaaS platform, transition to another platform becomes very difficult, and potentially impossible. Will the emerging open platform as a service (OPaaS) address this problem by letting programmers use whatever tools and languages they need?

The way we work will also change. According to a study conducted by Accenture, by 2013, seventy percent of mobile phones in developed nations will support Internet browsers. The same report reveals that the millennial generation (people born in the last decade of the twentieth century) will change the face of the workforce.

Friday, November 6, 2009

Hooking ERP Up with MES: Good, But Not Sufficient Yet

Without tight, near real-time integration between ERP and manufacturing execution systems (MES), there is much anxiety and frustration within any enterprise that is in search of a more competitive, profitable, safe, and agile factory. How can any manufacturing company reduce non-value-adding administration and empower its workforce to take immediate remedial actions?

Namely, the typical current state of affairs from the perspective of a senior vice president (SVP) of operations could be summarized as follows:

1. On one hand, the ever more pressured manufacturing environment demands an accelerating stock-keeping unit (SKU) mix and shorter lead times, all due to ever more demanding and fickle customers; but
2. On the other hand, the real-world situation is one of little enterprise-wide and/or SKU-level profit visibility, and the company has to rely on (suboptimal) average key performance indicators (KPIs), with emergency scheduling (constant firefighting) done on paper or in Excel documents.

In such “clueless” environments, there are “blind spots” everywhere: yields and losses are hard to determine, capacity opportunities stay hidden, and process routings and constraints are masked by reactive work practices. There are also increased risks of quality non-compliance, leading to manual quality assurance (QA) processes, while continuous improvement efforts flounder and remain unmeasured. In a nutshell, the hands-on plant people do not seem involved and are, ironically, not accountable for what they should be.

The future state should logically be the inverse of the above, and the usual “first remedial step conclusion” is to gather the glut of data from data historians and MES databases, and then decide what to do. But, without smart and intuitive plant applications that have visualization and contextual business intelligence (BI) capabilities (and that are thus accepted by the plant staff), this will all be yet another exercise in futility.

The reality check reveals an “inconvenient truth”: many MES investments fail to deliver the hoped-for performance management outcomes due to people issues. Namely, after 18 months or so, the embattled company in question might have an overall equipment effectiveness (OEE) dashboard that the plant engineers occasionally look at (and which might have cool colors on it), but without a pervasive effect (actionable info) and acceptance across the plant (and the entire enterprise).
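
For readers unfamiliar with the metric, here is a minimal sketch (with made-up shift numbers) of the standard OEE calculation, the product of availability, performance, and quality:

def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """OEE = availability x performance x quality (all inputs here are hypothetical)."""
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

# Example shift: 480 planned minutes, 400 minutes actually running,
# 0.8 ideal minutes per unit, 450 units produced, 430 of them good.
print(f"OEE = {oee(480, 400, 0.8, 450, 430):.1%}")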

Curbing MESsy Shop Floor State of Affairs – Part I

Those that follow manufacturing-oriented enterprise applications have likely noticed for some time an uptick of conversations about the need to better integrate high-speed manufacturing operations (the real-time world of the plant) with the planning and engineering departments (the transactional and design world of enterprise systems). The nirvana (or utopia) hoped for thus far has been to provide a single point of operation and control for manufacturers to: Plan, Define, Control, Execute, and Analyze Production.

Why do we need integrated manufacturing operations, or manufacturing execution systems (MES) linked to transactional enterprise resource planning (ERP) systems, likely via some plant-level integration hub and visualization & intelligence layer?

Well, it is not a major revelation to say that, for instance, in the discrete manufacturing sector, fabrication and assembly processes are being run and managed by isolated applications, such as “Post-it” notes, Microsoft Excel spreadsheets, Microsoft Access databases, and a plethora of niche vendors’ plant applications (point solutions like data historians). This creates an overwhelming number of individual silos (or islands) of manufacturing data and operations.

These silos are typically not connected to enterprise-level (“ivory tower”) systems like ERP, Computer-Aided Design (CAD), Product Lifecycle Management (PLM), and so on. This lack of integration and real-time connection then all too often results in huge operational inefficiencies, lost productivity, wasted time and materials, sub-par products, and so on.

Consequently, major decisions in the offices are based on theory and hunch rather than on actual and actionable data. But instead of traditionally managing operations “in the dark”, companies should rather strive to capitalize on all of the operational opportunities coming from the following sources: people, processes, and the plant equipment.

ERP Does It… Not!

Some might logically wonder whether ERP systems can take care of this (and why not, if that is the case). Well, at best, the core ERP systems’ functional scope only provides a financial and inventory snapshot of how a manufacturer is performing. Core ERP systems cannot tell users what is happening on the manufacturing floor right now (at this instant). Ironically, however, what is happening at this moment impacts the financial performance later.

To be more illustrative, ERP is good at producing a forecast-based demand plan for decision makers, and at giving answers to sales, purchasing, and manufacturing order inquiries like “What?”, “When?”, “For whom?”, and “At what cost?” Conversely, MES is good at providing the record of production that is supplied by plant operators (e.g., engineers, supervisors, machinists, etc.), who thereby inadvertently turn into mere data collectors.

The execution system is able to provide answers to the questions like “What are the schedule changes?”, “What is the product build history?”, “When will it be done?”, “How is the product quality?”, and “Where is the batch?”, but without any awareness of the customer (the particular order for that customer) or the particular order costs. In a nutshell, MES systems are devoid of any customer- and order-related information.

Curbing MESsy Shop Floor State of Affairs – Part II

MES solutions that integrate seamlessly into existing enterprise applications thus connect manufacturing to the enterprise in order to:

* Reduce costs and improve profits by collecting and communicating real-time manufacturing data throughout the product lifecycle; and
* Closely control and continuously improve operations, quality, and visibility across facilities worldwide.

By standardizing the best practices of lean manufacturing, overall equipment effectiveness (OEE), and continuous process improvement (CPI), such solutions should provide a real-time framework that unites capabilities like finite (constraints-based) factory scheduling, operations, quality, safety, performance management (via analytics), and enterprise asset management (EAM).

Plant-level execution systems have thus far largely been adopted by big companies in a big way. The historic condition in this highly fragmented market was that offerings were too niche-oriented and offered by many small software companies. A large enterprise would have to purchase many offerings and stitch them together to get a full solution. Today, however, comprehensive packaged factory solutions that are repeatable, scalable, and transferable are changing that dynamic.

Some Shining Examples

Some good examples in this regard would be the rare few ERP vendors with native MES capabilities, starting with IQMS and its EnterpriseIQ suite [evaluate this product]. In mid-2008, IQMS launched a new Automation Group to expand the interface capabilities of its EnterpriseIQ ERP system with manufacturing equipment on the shop floor.

Look for a separate article on IQMS down the track. In the meantime, you can find more information about the vendor here and in TEC’s earlier article entitled “Manufacturer’s Nirvana — Real-Time Actionable Information.” Also, there is an informative Enterprise Systems Spectator’s blog post on IQMS here.

Solarsoft (formerly CMS Software [evaluate this product]) would be another good ERP-MES example following its acquisition of Mattec a couple of years ago. The upcoming Epicor 9 product will also feature a native MES module; Epicor 9 represents a complete rewrite and convergence, on the basis of service-oriented architecture (SOA) and Web 2.0, of selected best-of-breed functional concepts from the respective individual products (like Epicor Vantage [evaluate this product], Epicor Enterprise [evaluate this product], Epicor iScala [evaluate this product], and so on). Of course, some functionality within Epicor 9 will be brand new, while some modules will represent embedded third-party products (unbeknownst to the customer).

Curbing MESsy Shop Floor State of Affairs – Part III

As for the user interface (UI), it is critical that it match the worker’s job. There is a saying that “the worker works the way the worker wants to work,” and in any plant-level system a role-tailored and industry-specific UI is incredibly important for streamlining, or “leaning out,” the tasks that are required.

This is why ERP systems have customarily been so poor at handling manufacturing execution: not necessarily that the functionalities don’t exist in some ERP manufacturing offerings, but that it is too difficult for workers to input the necessary information. I fully agree with AMR Research’s 2007 alert entitled “CDC Software Delivers Operations Excellence in Plain English” that states:

“…Knowledge workers on the shop floor can’t waste time—downtime or overtime—filling out forms or navigating complex screens and menus to accomplish their goals…

…Role-based workflows, stored procedures, and no-nonsense user interfaces are designed to guide operations personnel through common scenarios. The UI is also designed to work with touch-screen interfaces, supporting simplified and rapid data-entry scenarios. This is a far cry from the historically cumbersome multi-page, multi-tab interfaces offered by traditional ERP systems.”

Such intuitive and engaging technologies might even help to overcome the traditional cultural barriers between the enterprise and the plant, which are well depicted in a recent Optimal Solutions article:

“…Many plants, particularly those over a decade old, began as quasi-independent entities. As these plants evolved, a culture of isolationism often took hold.

Inside manufacturing plants, engineering, operations and information technology (IT) established fiefdoms. Over time, plant engineers, plant managers, process control operators and machinists learned to execute their respective functions while respecting each other’s turf. Overly intrusive corporate oversight was often kept at bay by hitting production numbers and keeping costs under control…”

AMR Research’s Manufacturing Peer Forum members say that unlocking the potential of plant operations personnel and letting them take ownership of the continuous improvement process (CIP) has been highly successful in operations.

Enter CDC Factory

This brings us to CDC Software’s CDC Factory, a packaged manufacturing execution and operations intelligence management system that transforms manufacturing performance by enabling operations people to take immediate action. The suite is aimed at catering to the needs of midsize manufacturers in the food and beverage (F&B), pharmaceutical, and consumer packaged goods (CPG) industries, either in tandem with the sibling Ross ERP suite [evaluate this product] or with any other ERP product for that matter.

Although CDC Factory is ERP-agnostic, future developments are meant to leverage the capabilities from Ross ERP and vice versa. Accordingly, the end of 2008 saw the Ross Factory module extending and expanding value for Ross ERP customers.

Along similar lines, in early 2009 CDC Factory should feature Trace Express, the Ross ERP system’s ability to trace orders and materials forward from suppliers and backward from customers with audit trails. In late 2008, the Ross BPM (Business Performance Management) 6.3 product was launched with a new UI, Microsoft SharePoint portal, and document management integration, which will be extended to CDC Factory at a later stage.

CDC Factory users’ empowerment and involvement are encouraged via a factory-floor UI for capturing all real-time inputs, outputs, information, alerts, decision-making processes, and actions. Different plant roles (e.g., executives, managers, supervisors, and operators) have somewhat differently tailored screens serving the respective knowledge worker. But in all cases, the UI combines Microsoft Office familiarity with consumer-style screens like those of automatic teller machines (ATMs) or restaurant/retail point-of-sale (POS) terminals.

For example, a supervisor will see the following: many pertinent key performance indicators (KPIs) and other analytical information such as plant dashboards, end-of-shift summaries, end-of-production-run summaries, top downtime reasons, quality adherence, tactical analysis, and so on.

Many best practices were built into the product as a natural extension of daily activities, to help managers and shop-floor workers suggest and/or leverage common-sense CIP techniques. These best practices can be in terms of working practices (e.g., standard operating procedures [SOPs] or good manufacturing practices, role accountability, etc.), key comparative metrics, comparable manufacturing processes, standard definitions, standard data (e.g., coding, failure codes, etc.), and so on.

Also incorporated are the business performance management (BPM), analysis, and workflow capabilities normally associated with total productive maintenance (TPM), total quality management (TQM), Six Sigma, and other lean manufacturing toolsets. All this without necessarily requiring the operators’ intimate knowledge of this trendy academic terminology (which might sound like Greek to many plant folks).

For example, comparative factory KPIs would be the factory’s overall equipment effectiveness (OEE) trends, manufactured units per man-hour across plants, cost per unit (by product and/or product type), and on-time delivery. For their part, plant performance KPIs would be performance by shift/product, plan attainment, downtime percentage, labor variance, and waste percentage. Finally, operations intelligence and analysis could provide KPIs like the categorization of downtime, reasons for yield loss, or reasons for slow running.
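
As a minimal sketch (the shift figures below are invented), two of the plant performance KPIs named above, plan attainment and downtime percentage, could be computed like this:

def plan_attainment(units_produced, units_planned):
    return units_produced / units_planned

def downtime_percentage(downtime_minutes, scheduled_minutes):
    return downtime_minutes / scheduled_minutes

# Hypothetical shift data.
print(f"Plan attainment:     {plan_attainment(units_produced=920, units_planned=1000):.1%}")
print(f"Downtime percentage: {downtime_percentage(downtime_minutes=45, scheduled_minutes=480):.1%}")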

Open Source and Business Intelligence: The Common Thread

"Open source applications" is the term that describes systems built using open source software in the form of frameworks or libraries. Although copyleft licenses do not permit organizations to resell software developed using open software, mechanisms such as dual-license models have arisen, whereby commercial vendors can deliver their software under a community license that follows the open source license regulations and offers a commercial license with an attached fee. Vendors may charge users for services such as support, training, consulting, and advanced features.

In the past two years, commercial open source vendors have been working actively towards establishing a long-term position in the enterprise applications space. In February 2007, the Open Solutions Alliance (OSA) was formed to bring together commercial open source software businesses; its main purpose is to broaden the horizon of open source applications and, most importantly, foster interoperability between them. JasperSoft, one of the pioneers of open source BI, is among the founding members of this alliance. Pentaho, another open source BI vendor, has set itself apart by leading and sponsoring all of its core projects, implementing open industry standards, and establishing partnerships with vendors of data warehouse technology, such as InfoBright and ParAccel.

BI has some of the most challenging technology problems among all enterprise software applications. These challenges include the design of very large databases; complex data integration between disparate and multiple data sources; the ability to search across a surfeit of information; and some of the most stringent performance and latency requirements. Even with proprietary solutions, organizations need a team of experienced professionals—including database administrators, business analysts, and programmers—to implement and support a data warehouse and BI environment.

Open source BI goes one step further: it encourages organizations to use and modify the software as needed and share advances with the rest of the community. It seems only natural that open source and BI technologies have converged. A crucial factor to consider when adopting an open source BI solution is that underlying technologies are often, if not always, open source themselves; although not mandatory, it is prudent to have technical teams acquire the necessary skills. For instance, most open source BI software is built on the LAMP stack. In order to adopt and maintain the applications, technical teams need to have development and administration skills using the LAMP stack.
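
To give a feel for the kind of skill involved, here is a purely illustrative sketch: an in-memory SQLite database stands in for the MySQL database a LAMP-based BI tool would normally query, and the table, columns, and figures are invented.

import sqlite3

# SQLite stands in here for MySQL; the schema and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_facts (region TEXT, order_year INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales_facts VALUES (?, ?, ?)", [
    ("EMEA", 2009, 125000.0), ("Americas", 2009, 210000.0), ("EMEA", 2009, 80000.0),
])

# A typical aggregate query a BI report might issue: revenue by region for one year.
for region, revenue in conn.execute("""
        SELECT region, SUM(amount) AS revenue
        FROM sales_facts
        WHERE order_year = 2009
        GROUP BY region
        ORDER BY revenue DESC"""):
    print(region, revenue)

conn.close()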

Extensible Business Reporting Language (XBRL) Back in the News Again

Visiting the Securities and Exchange Commission’s (SEC’s) web site, I came across this 143-page PDF file, which deals with XBRL. As a gung-ho proponent of automation, I’m calling attention to it here to show that the head of the SEC (Mr. Christopher Cox) and I are on the same wavelength when it comes to promoting cost-saving automation. Here is some interesting stuff from the PDF, together with my comments.

The Christopher Cox modernization commission proposes that companies provide their financial statements to the Commission, and on their corporate Web sites, in interactive data format using the eXtensible Business Reporting Language (XBRL). According to a statement on page 29 of the PDF, whether this requirement would become compulsory was to be decided on December 15, 2008.

XBRL was derived from the XML standard. It was developed and continues to be supported by XBRL International, a collaborative consortium of approximately 550 organizations representing many elements of the financial reporting community worldwide in more than 20 jurisdictions, national and regional.
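
To give a feel for what “derived from XML” means in practice, here is a purely illustrative sketch: the tag names and namespace below are simplified stand-ins rather than an actual XBRL taxonomy, parsed with Python’s standard library.

import xml.etree.ElementTree as ET

# A simplified, made-up XBRL-style fragment: each fact carries a context and unit reference.
document = """
<xbrl xmlns:demo="http://example.com/demo-taxonomy">
  <demo:Revenues contextRef="FY2008" unitRef="USD" decimals="-3">1500000</demo:Revenues>
  <demo:NetIncome contextRef="FY2008" unitRef="USD" decimals="-3">230000</demo:NetIncome>
</xbrl>
"""

root = ET.fromstring(document)
for fact in root:
    tag = fact.tag.split("}")[-1]  # strip the namespace prefix
    print(tag, fact.attrib["contextRef"], int(fact.text))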

The proposal is a significant one for Mr. Cox as he is a proponent of modernization and the use of current technology to reduce business and government expenses. His legacy will be the promotion of “interactive data” and modernization of SEC filings through the use of XBRL.

Mr. Cox has decided to retire at the end of President Bush’s term. During Mr. Cox’s tenure, the regulator has convinced over 8,000 companies to use XBRL in various types of filings. Large international organizations such as Procter & Gamble and Pepsi, which file under GAAP and IFRS, are using XBRL and are firmly on the pro-XBRL bandwagon.

Sterling Software Sees the Light with Eureka:Intelligence

MINNEAPOLIS, Nov. 15 /PRNewswire/ -- Sterling Software, Inc. (NYSE: SSW) announced general availability today of its new EUREKA:Intelligence product. EUREKA:Intelligence is a Web-based, integrated query, analysis, and reporting tool that allows users to easily query, manipulate, and format data for personal or shared use. It integrates the most commonly used business intelligence capabilities into a single tool, making it valuable for the majority of business users. EUREKA:Intelligence is ideally suited for large, distributed organizations that need to meet general-purpose analysis and reporting requirements for a large portion of their business users. The 100% Java tool also satisfies the unique requirements of business-to-business e-commerce companies.

EUREKA:Intelligence is the latest product to be added to the integrated EUREKA:Suite. The suite also contains products for web-based production reporting, analysis of very large databases, and advanced ad-hoc analysis. In addition, EUREKA:Portal provides a single point of entry and is a common platform of services for all EUREKA:Suite products. Since EUREKA:Intelligence is 100% Java, it provides cross-platform support. The client code is maintained and deployed automatically by the EUREKA:Intelligence server, so there is no need to install or update client code. The product has multiple analytical view modes including chart, pivot, table, and report document. There is also a scheduling capability to allow users to monitor performance over time. Time-series groupings of similar archived reports can be used to generate historical information.

Tuesday, October 20, 2009

Analysis of SAS Institute and IBM Intelligence Alliance

"At IBM's PartnerWorld 2000 in San Diego this Monday (24Jan00), SAS Institute and IBM will announce a new business intelligence relationship that will include the formation of consulting practices focused on SAS solutions, and further development of e-business intelligence solutions that integrate IBM's DB2 database product family and SAS software.

The announcement between the two business intelligence leaders is the latest in a select group of key strategic relationships forged by IBM as it refocuses its partnering efforts to provide world-class e-business applications. Recent announcements have included partnerships with other leading software providers such as Siebel Systems and SAP AG.

The agreement between IBM and SAS Institute and the planned joint development efforts will result in:

* Creation of a consulting practice in IBM Global Services specializing in SAS solutions. These consultants will work with joint customers to integrate the powerful decision support capabilities of SAS solutions with existing transaction systems and other e-business applications.

* Closer integration of SAS solutions and DB2 Universal Database to enhance performance for all IBM server platforms, including Netfinity, AS/400, RS/6000, NUMA-Q and S/390.

* IBM Global Services' access to a wide range of SAS Institute solutions for business intelligence, data warehousing, and decision support.

The relationship will initially focus on three primary areas where IBM and SAS Institute will offer end-to-end solutions to enterprise customers. IBM Global Services will provide the analytical services, systems integration and industry-specific consulting expertise. SAS Institute will provide software solutions for Customer Relationship Management (CRM), Enterprise Resource Planning (ERP), and Supplier Relationship Management (SRM). IBM and SAS Institute plan to more tightly integrate and thus enhance performance of DB2 Universal Database and SAS software."

The existing customer base for IBM DB2 Universal Database should be strongly interested in this development. The ability to access customer relationship management and extended supply chain solutions should be especially intriguing. We believe that the combination of SAS's strong business intelligence solutions and IBM's global sales and consulting forces will be a powerful one. The question for customers will be whether this is just a marketing alliance or an actual combination of products at the code level, allowing customers to seamlessly integrate the products.

Using Predictive Analytics within Business Intelligence: A Primer

Predictive analytics has helped drive business intelligence (BI) towards business performance management (BPM). Traditionally, predictive analytics and models have been used to identify patterns in consumer-oriented businesses, such as identifying potential credit risk when issuing credit cards, or analyzing the buying habits of retail consumers. The BI industry has shifted from identifying and comparing data patterns over time (based on batch processing of monthly or weekly data) to providing performance management solutions with right-time data loads that allow accurate decision making in real time. Thus, the emergence of predictive analytics within BI has become an extension of general performance management functionality. For organizations to compete in the marketplace, taking a forward-looking approach is essential. BI can provide the framework for organizations focused on driving their business based on predictive models and other aspects of performance management.

We'll define predictive analytics and identify its different applications inside and outside BI. We'll also look at the components of predictive analytics and its evolution from data mining, and at how they interrelate. Finally, we'll examine the use of predictive analytics and how they can be leveraged to drive performance management.

Overview of Analytics and Their General Business Application

Analytical tools enable greater transparency within an organization, and can identify and analyze past and present trends, as well as discover the hidden nature of data. However, past and present trend analysis and identification alone are not enough to gain competitive advantage. Organizations need to identify future patterns, trends, and customer behavior to better understand and anticipate their markets.

Traditional analytical tools claim to have a 360-degree view of the organization, but they actually only analyze historical data, which may be stale, incomplete, or corrupted. Traditional analytics can help gain insight based on past decision making, which can be beneficial; however, predictive analytics allows organizations to take a forward-looking approach to the same types of analytical capabilities.

Credit card providers offer a first-rate example of the application of analytics (specifically, predictive analytics) in their identification of credit card risk, customer retention, and loyalty programs. Credit card companies attempt to retain their existing customers through loyalty programs, and need to take into account the factors that cause customers to choose other credit card providers. The challenge is predicting customer loss. In this case, a model which uses three predictors can be used to help predict customer loyalty: frequency of use, personal financial situations, and lower annual percentage rate (APR) offered by competitors. The combination of these predictors can be used to create a predictive model. The predictive model can then be applied and customers can be put into categories based on the resulting data. Any changes in user classification will flag the customer. That customer will then be targeted for the loyalty program. Financial institutions, on the other hand, use predictive analytics to identify the lifetime value of their customers. Whether this translates into increased benefits, lower interest rates, or other benefits for the customer, classifying and applying patterns to different customer segmentations allows the financial institutions to best benefit from (and provide benefit to) their customers.
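
As a purely illustrative sketch of the three-predictor loyalty model described above (the data is synthetic and the feature encoding is hypothetical), a simple logistic regression could be fitted and used to flag at-risk customers:

import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: monthly card uses, debt-to-income ratio, competitor APR advantage (percentage points).
X = np.array([
    [25, 0.20, 0.5],
    [ 3, 0.55, 4.0],
    [18, 0.30, 1.0],
    [ 1, 0.60, 5.5],
    [30, 0.15, 0.0],
    [ 5, 0.45, 3.5],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = customer left (churned), 0 = stayed

model = LogisticRegression().fit(X, y)

# Score a new customer; a high churn probability would flag them for the loyalty program.
new_customer = np.array([[4, 0.50, 3.0]])
churn_probability = model.predict_proba(new_customer)[0, 1]
print(f"Estimated churn probability: {churn_probability:.2f}")
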
Data mining can be defined as an analytical tool set that searches for data patterns automatically and identifies specific patterns within large datasets across disparate organizational systems. Data mining, text mining, and Web mining are types of pattern identification. Organizations can use these forms of pattern recognition to identify customers' buying patterns or the relationship between a person's financial records and their credit risk. Predictive analytics moves one step further and applies these patterns to make forward-looking predictions. Instead of just identifying a potential credit risk, an organization can identify the lifetime value of a customer by developing predictive decision models and applying these models to the identified patterns. These types of pattern identification and forward-looking model structures can equally be applied to BI and performance management solutions within an organization.

Predictive analytics is used to determine the probable future outcome of an event, or the likelihood of a situation occurring. It is the branch of data mining concerned with the prediction of future probabilities and trends. Predictive analytics is used to automatically analyze large amounts of data with many different variables, using techniques that include clustering, decision trees, market basket analysis, regression modeling, neural nets, genetic algorithms, text mining, hypothesis testing, decision analytics, and so on.

The core element of predictive analytics is the predictor, a variable that can be measured for an individual or entity in order to predict future behavior. Predictors feed the models that are built to exploit the analytical capabilities of predictive modeling. Descriptive models classify relationships by identifying customers or prospective customers and placing them in groups based on identified criteria. Decision models consider business and economic drivers and constraints that go beyond the general functionality of a predictive model. In a sense, statistical analysis helps to drive this process as well. The predictors are the factors that help determine the outcomes of the actual model. For example, a financial institution may want to identify the factors that make a valuable lifetime customer.

Multiple predictors can be combined into a predictive model, which, when subjected to analysis, can be used to forecast future probabilities with an acceptable level of reliability. In predictive modeling, data is collected, a statistical model is formulated, predictions are made, and the model is validated (or revised) as additional data becomes available. One of the main differences between data mining and predictive analytics is that data mining can be a fully automated process, whereas predictive analytics requires an analyst to identify the predictors and apply them to the defined models.

A decision tree is a modeling technique within predictive analytics that allows the user to visualize the mapping of observations about an item to conclusions about the item's target value. Basically, decision trees are built by creating a hierarchy of predictor attributes. The highest level represents the outcome, and each sub-level identifies another factor in that conclusion. This can be compared to if-else statements, which identify a result based on whether certain factors meet specified criteria. For example, in order to assess potential bad debt based on credit history, salary, demographics, and so on, a financial institution may wish to identify multiple scenarios, each of which is likely to meet bad-debt customer criteria, and use combinations of those scenarios to identify which customers are most likely to become bad-debt accounts.
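
Continuing the bad-debt example in spirit (all thresholds below are invented, purely for illustration), a tiny decision tree can be written out as the nested if-else statements it resembles:

def bad_debt_risk(credit_score, debt_to_income, years_employed):
    """Toy decision tree, expressed as nested if-else statements; thresholds are hypothetical."""
    if credit_score < 600:
        if debt_to_income > 0.4:
            return "high risk"
        return "medium risk"
    else:
        if years_employed < 2 and debt_to_income > 0.5:
            return "medium risk"
        return "low risk"

print(bad_debt_risk(credit_score=580, debt_to_income=0.45, years_employed=3))  # -> high risk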

Regression analysis is another component of predictive analytics that allows users to model relationships between three or more variables in order to predict the value of one variable from the values of the others. It can be used to identify buying patterns based on multiple demographic qualifiers, such as age and gender, which can help identify where to sell specific products. Within BI, this is beneficial when used with scorecards that focus on geography and sales.
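
As a purely illustrative sketch (synthetic numbers, with gender encoded as a hypothetical 0/1 segment), a two-predictor regression of spend on age and gender might look like this:

import numpy as np
from sklearn.linear_model import LinearRegression

# Predictors: [age, gender] with gender encoded 0/1; target: monthly spend (synthetic data).
X = np.array([[22, 0], [35, 1], [47, 0], [29, 1], [53, 1], [41, 0]])
y = np.array([180.0, 320.0, 260.0, 290.0, 400.0, 240.0])

model = LinearRegression().fit(X, y)
predicted_spend = model.predict(np.array([[30, 1]]))[0]
print(f"Predicted monthly spend for a 30-year-old in segment 1: {predicted_spend:.0f}")
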
Practical applications of all of these analytical models allow organizations to forecast results and predict financial outcomes, hopefully increasing revenues in the process. Within BI, aside from financial outcomes, predictive analytics can be used to develop corporate strategies throughout the organization. What-if analyses can be performed to leverage the capabilities of predictive analytics to build various scenarios, allowing organizations to map out a series of outcomes of strategic and tactical plans. This way, organizations can implement the best strategy based on the scenarios created.

How Predictive Analytics Are Used within BI, and How They Drive an Organization's BPM

Data mining, predictive analytics, and statistical engines are examples of tools that have been embedded in BI software packages to leverage the benefits of performance management. If BI is backward looking, and data mining identifies the here and now, then predictive analytics, as used within performance management, is the looking glass into the future. This forward-looking view helps organizations drive their decision making. BI is known for its consolidation of data from disparate business units, and for its analysis capabilities based on that consolidated data. Performance management goes one step further by leveraging the BI framework (such as the data warehousing structure and extract, transform, and load [ETL] capabilities) to monitor performance, identify trends, and give decision makers the ability to set appropriate metrics and monitor results on an ongoing basis.

With predictive analytics embedded within the above processes, the metrics set and business rules identified by organizations can be used to identify the predictors that need to be evaluated. These predictors can then be used to shift towards a forward-looking approach in decision making by using the strengths from the areas identified above. Scorecards are one example of a performance management tool that can leverage predictive analytics. The identification of sales performance by region, product type, and demographics can be used to define what new products should be introduced into the market, and where. In general, scorecards can graphically reflect the selected sales information and create what-if scenarios based on the data identified to verify the right combinations of new product distribution.

What-if scenarios can be used within the different visualization tools to create business models that anticipate what might happen within an organization based on changes in defined variables. What-if analysis gives organizations the tools to identify how profits will be affected by changes in inflation and pricing patterns, as well as by increasing the number of employees throughout the organization. Online analytical processing (OLAP) cubes can be created to hold the dimensional data, and patterns within changing dimensions can be compared over time, using the cube structure to automatically view and contrast the outcomes of the what-if scenarios.
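
As a deliberately simple sketch of the idea (every figure below is invented), comparing projected profit across a baseline and two what-if scenarios might look like this:

# Purely illustrative what-if comparison: baseline vs. a price increase vs. adding headcount.
scenarios = {
    "baseline":       {"price": 100.0, "units": 10_000, "unit_cost": 60.0, "headcount": 50},
    "price_increase": {"price": 105.0, "units": 9_500,  "unit_cost": 60.0, "headcount": 50},
    "expand_team":    {"price": 100.0, "units": 11_000, "unit_cost": 60.0, "headcount": 55},
}
COST_PER_EMPLOYEE = 80_000 / 12  # assumed monthly fully loaded cost

for name, s in scenarios.items():
    profit = s["units"] * (s["price"] - s["unit_cost"]) - s["headcount"] * COST_PER_EMPLOYEE
    print(f"{name:>15}: projected monthly profit = {profit:,.0f}")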

Marketing and Intelligence, Together at Last

Angara offers an ASP-based service for targeting web site content to unidentified visitors (see article, "Getting Strangers to Take Your Candy"). The company buys online profile data from other websites. These are data that users agree to provide in exchange for receiving newsletters or other offers or are captured from clickstreams by online advertising networks such as MatchLogic.

By arrangement with the websites, Angara gets to drop a cookie - but not to collect any data that might identify the user as an individual. When the user later visits an Angara customer, Angara can provide segmentation information such as age, sex, or geographic region. The customer's website uses the segmentation information to serve targeted content to the visitor.

In the case of data from ad agencies, Angara is given access to the cookies dropped by the agencies. In both cases the data only identify broad characteristics of the user, such as sex, interests and responses to categories of advertising. The goal is to make first time visitors more likely to make purchases or return to the site.

Net Perceptions specializes in analysis of existing customers. The company sifts through data on the viewing and purchasing behavior of shoppers and uses its conclusions to make recommendations for personalized offers and targeted ads. Net Perceptions has chosen Angara to complement its own offerings in its new ASP offering, called the Net Perceptions Personalization Network.

The Personalization Network will offer four "channels," each driven from databases compiled by the network:

* The Intelligence Channel provides analytic tools to let companies understand their website visitors.

* The Recommendation Channel makes recommendations for cross-sells and up-sells based on the behavior of previous visitors.

* The Customer Acquisition Channel uses Angara's Converter product to target content to first-time visitors.

* The E-Mail Channel, a strategic partnership with Xchange, Inc., provides clients with the ability to design and target consumer emails.

The move by eCRM firms to embrace ASP offerings is accelerating. In Angara's case an ASP model is a necessity because of the dependence of their solution upon the data they collect. Net Perceptions is one of the very first to move an existing, successful suite to an ASP model. We expect that this will encourage the expansion of the market.

One thing that could hurt this market in the future is a privacy scare. Angara has a good privacy model in that they never get to see information that identifies individuals. One might argue that opting in to receive promotions does not necessarily mean that you want to be identified to a service that tells websites how to serve content to you, but given that the Web is supported by advertising this seems to us like a minimal intrusion, if it is one at all.

Net Perceptions takes no responsibility for the use its customers make of their data; its official policy is "Net Perceptions encourages all of its customers to adopt privacy standards of their own and make those standards freely accessible." We haven't yet seen a privacy policy for the ASP service. We believe that it should contain provisions that each ASP customer's data will be kept isolated from that of all the other customers, and that data collected through Net Perceptions' applications will not become part of the Angara database.

We don't see data merging of this type to be a priori improper - that would depend on the mechanics - but we feel certain that it would ignite the concerns of privacy advocates and the public. Angara assures us that there are in fact no plans for any such data merging.

Enterprise Resource Planning Vendor Gains Connectivity through Acquisition of Plant Intelligence Provider

The acquisition of Lighthammer Software Development Corporation (www.lighthammer.com), a privately held supplier of enterprise manufacturing intelligence and collaborative manufacturing software, by SAP might indicate that manufacturing operations management (MOM) software systems are becoming ripe for consolidation. MOM software is the Web-based collaborative software layer (with the traits of both integration and analytic applications) for monitoring plant-level trends, establishing operational context, and disseminating plant-level information to other enterprise-wide constituencies. It is also referred to as enterprise manufacturing intelligence (EMI), manufacturing performance service (MPS), or whichever other acronym some analyst has come up with to make the traditionally not very user-friendly space that includes manufacturing execution systems (MES), plant automation systems, and other plant-centric technologies seem more attractive.

For background information on this acquisition, see The Importance of Plant Level Systems, Multipurpose SAP NetWeaver, and Enterprise Resource Planning Giants Eye the Shop Floor.

In fact, there have been numerous examples of other large plant-centric vendors (including the likes of ABB, Rockwell Automation, General Electric [GE], and Siemens) acquiring an array of companies and products (such as the former Skyva, Systems Modeling, IndX, and Datasweep), thus enabling them to build a broader, integrated, single-source MES scope. SAP's acquisition of Lighthammer might suggest that such manufacturing floor ventures of enterprise applications vendors are more than merely a knee-jerk reaction to a long overdue and much anticipated spending increase in the plant-level software market (see Do Chinese Enterprises Really Need MES and WMS? and The Challenges of Integrating Enterprise Resource Planning and Manufacturing Execution Systems).

Plant floor applications are generally very different from each other, even though their vendors deliver somewhat generic solutions, since continuous flows, discrete piece production rates, temperatures, pressures, and other manufacturing process parameters are common across many manufacturing applications. Still, owing to a dearth of standardized plant-level processes, combined with a raft of manufacturing styles and industry-specific regulatory compliance (and consequently quality assurance) requirements, user organizations have typically implemented applications on a system-by-system basis. This is in part a response to firefighting requirements defined by department managers, manufacturing engineers, and equipment or process vendors.

This diversity of applications affects one of the major roles of the plant execution system, which is to collect and pool data from real time processes for delivery to planning-level enterprise applications, including enterprise resource planning (ERP) and supply chain management (SCM) systems. This is in part because, while mainstream ERP vendors have invested in making their products more attractive to specific vertical markets, they cannot really afford to deliver specialized plant-level functionality unless there is a large market for it.

As with its earlier appetizing acquisitions, such as those of TopTier, TopManage, and A2i (see SAP Acquires TopTier to Further Broaden Its Horizons and SAP Bolsters NetWeaver's MDM Capabilities; Part Four: SAP and A2i), the Lighthammer deal should provide SAP with several benefits. For one, the two parties have quite close product DNAs, since Lighthammer has long been a strategic marketing and development partner, with a manufacturing-centric product, which is now delivered as an SAP xApp-like composite application.

Lighthammer, formerly an SAP partner, had worked to create technology integration between its products and the SAP architecture, so reconfiguring Lighthammer as an SAP composite application running on SAP NetWeaver should present no special difficulty. With the acquisition of Lighthammer, SAP gains workflow-based connectivity to virtually any source on the plant floor and analytical functionality with Lighthammer's products for plant intelligence. This meshes well with SAP's recent business intelligence (BI) dashboard forays (see Business Intelligence Corporate Performance Management Market Landscape).

Furthermore, a high percentage (over 85 percent) of Lighthammer's approximately 150 clients are also SAP clients, a fact which should help SAP manage these clients' expectations. In addition, the improved plant-level functionality should make SAP more competitive in non-SAP environments as well. In particular, SAP's existing non-Lighthammer manufacturing clients should benefit, because they should gain greater flexibility in integrating multiple plant floor solutions with SAP. On the flip side of the coin, the vendor has pledged to support existing Lighthammer-only customers for a period of time. However, logically, the value of operating in this mode would decrease if customers are not going to pursue an SAP-centric strategy in the long term.

We concur with AMR Research's finding in the SAP Plus Lighthammer Equals xMII November 2005 report that there are ample opportunities for vendors to amplify xApp Manufacturing Integration and Intelligence (xMII) in terms of data historians or operational data stores, industry-based manufacturing models and KPIs, data mining add-ons to enable proactive, model-based decision making, etc. xMII performance management product functionality is moving in the direction of enhanced alert and event management, knowledge management, real time economics, and directed workflows, which will be key to encapsulating information and capabilities that are needed to make better and faster decisions at multiple levels within manufacturing. Over time, xMII will leverage selected SAP technologies such as the new SAP Visual Composer, which reduces the effort required to develop user interfaces (UI), but will present a challenge to users who have to adapt to the change.

Another weak area that SAP acknowledges is its inability to structurally improve manufacturing processes themselves. This is because it is incredibly difficult to map what is happening on the shop floor in detail to, for example, the business systems or the costing systems. It is even more difficult across multiple plants, as the vendor has to provide customers with the ability not only to get the workflows right, but also to assemble the data needed to make structural improvements. For this, one would need a plant-level analytic server that could unify data from multiple process control systems into a single contextual database in order to capture, process, and transform real time raw data into intelligible monitoring, counting, and measuring information that could be used by planning and other systems.

The Lighthammer acquisition may compound the above problem. So far, Lighthammer's raison d'être has been mostly to provide visibility into disparate plant systems for root cause analysis, or, to put it another way, merely to take raw data and distill it on the screen. Unfortunately, SAP has never owned the complex data models or analytical tools for process discovery. Somewhat ironically, this might present many opportunities to other vendors that sell plant-focused applications at many levels of solutions for manufacturing and value networks, and might even allow them to use Lighthammer as the integration toolkit and interface to SAP. These vendors may include Invensys/Wonderware, Rockwell/Datasweep, Camstar, Accumence (formerly Visibility Systems), Visiprise, PEC Info, DSM, Activplant, Informance, OSIsoft, Pavilion Technologies, CIMNET, GE Fanuc, Citect, Siemens, Yokogawa, and PSI-BT, to name only a few.

A further challenge for SAP will be establishing itself as a trustworthy partner to independent software vendors (ISVs), many of which are afraid of being acquired, in order to build the necessary ISV ecosystem (see SAP NetWeaver Background, Direction, and User Recommendations). SAP also has to clarify for potential plant-level ISV partners how to use xMII as an underlying platform for delivering preconfigured industry templates and systems.

Another uncharted area is extending Lighthammer into the discrete manufacturing industries, since, despite over one hundred joint customers, the focus of this relationship has been predominantly on process manufacturing (e.g., chemicals or life sciences) environments. It makes sense for SAP to have started with the process industries, where the opportunity was more apparent. Nonetheless, although the acquisition restricts Lighthammer competitors from further penetrating SAP process manufacturing accounts, the next challenge is for both merging parties to respond to the unique needs of discrete manufacturers for standards-based interoperability and plant-level requirements within the automotive, aerospace, high technology, and other discrete manufacturing industries. In the end, the vendor hopes to achieve the maximum commonality between the two sectors, but that is going to be neither quick nor easy.

Even in light of the acquisition of Lighthammer, and given the natural question of what the acquisition means for other plant-level SAP software partners, SAP maintains that it will remain fully committed to supporting and growing these partner relationships, and that it does not expect the acquisition to interfere with that. Indeed, SAP may have an industry-wide ethical responsibility to stick to this commitment.

Exemplifying this, a group of leading manufacturing companies and software vendors endorsed the Instrumentation, Systems, and Automation Society's (ISA) ISA-95 Enterprise-to-Control System Integration standards and the World Batch Forum's (WBF) business to manufacturing markup language (B2MML) at a recent plant-to-business (P2B) interoperability workshop hosted by SAP and ARC Research. Workshop attendees also discussed the establishment of an open vendor and user consortium to share knowledge and best practices for plant floor to business integration and to provide compliance certification for use of B2MML and related standards. In addition to SAP and ARC, participants included representatives from Apriso, Arla Foods, Datasweep, Dow Corning Corporation, DuPont Engineering, Eli Lilly, Emerson Process Management, Empresas Polar S.A., GE Fanuc, General Mills, Invensys-Wonderware, Lighthammer, MPDV, MPR de Venezuela, OSIsoft, Procter and Gamble, PSI Soft, Rockwell Automation, Rohm and Haas, SAB Miller, Siemens, and Yokogawa, as well as representatives from ISA and WBF. Ever since this endorsement, progress in leveraging ISA-95 as a standard and the WBF's B2MML as an appropriate schema for the process industries has been remarkable.

Similarly, some two years ago, in response to growing customer need, SAP announced the industry-wide "manufacturing interoperability" initiative, the aim of which was to dramatically reduce enterprise-to-production systems integration costs by using available industry standards. For more details, see Multipurpose SAP NetWeaver.

SAP is not alone in having planning solutions that, in some cases, may extend deep into the plant floor, and the broader lack of integration exists in part because the parties involved in the software industry have tacitly agreed to divide the software world into disjointed sets of vendors: the automation vendors, the MES vendors, and the ERP or enterprise applications vendors. Viewing things through this old and outmoded mindset has brought many endeavors to a halt over the question of where the line between MES and ERP lies. As discussed in The Challenges of Integrating Enterprise Resource Planning and Manufacturing Execution Systems, the answer is often that there is no single clear line.

For example, due to the notion of "plant-centric ERP" in the 1990s, vendors, such as the former Marcam in process manufacturing and Baan in discrete manufacturing, had deep manufacturing models that challenged the artificial boundaries between ERP and MES. Along the same lines, Oracle plans to add more built-in plant-level functionality in the upcoming Oracle e-Business Suite Release 12, precluding the need for the typically extensive and painful customization outside of Oracle's toolset. Even smaller ERP vendors have been adding industry-specific functionality, for example Ross Systems (now part of CDC Software) for pharmaceutical and life sciences companies, IQMS for plastic processors, and Plexus Systems for automotive customers.

When it comes to SAP, it has a lot of customers that use SAP functionality to tie directly into low-level shop floor systems. However, there are also SAP customers that at the same time—at another site or division—use SAP in conjunction with an MES system. In the end, SAP will likely compete in the marketplace where it feels its functionality is competitive enough compared to other solutions. But the vendor acknowledges that customers have, for good reasons, installed other solutions, and will continue to do so. Thus, in order to be a trusted platform provider, it will want to be able to integrate with those systems.

Has SAP Nailed Plant Level Leadership with Lighthammer?

At the end of June, SAP announced that it was delivering enhanced connectivity between the plant floor and the enterprise by acquiring Lighthammer Software Development Corporation (www.lighthammer.com), a privately-held supplier of enterprise manufacturing intelligence and collaborative manufacturing software, based in Exton, Pennsylvania (US). Lighthammer and SAP shared a vision of adaptive business networks (ABN), as illustrated by their longstanding partnership, during which Lighthammer was a premier "SAP Powered by NetWeaver" and SAP xApps partner. The company's approximately sixty employees have reportedly remained in their current facilities, and have become a part of SAP America and SAP Labs. Mufson Howe Hunter & Company LLC, a Philadelphia, Pennsylvania (US)-based investment bank, served as financial advisor to Lighthammer on this transaction.

At the time of the announcement, the two merging parties and formerly close partners believed that the acquisition would deliver value through improved manufacturing performance with more rapid time-to-value for SAP's installed base of more than 12,000 manufacturing customers. Lighthammer's Collaborative Manufacturing Suite (CMS), currently used by hundreds of companies worldwide, including more than 100 Fortune 500 manufacturing companies, was to be delivered as an SAP xApps composite application on the SAP NetWeaver platform, so as to provide enterprises with what SAP refers to as adaptive manufacturing (i.e., the ability of a manufacturer to profitably replenish the supply chain while dynamically adapting to unpredictable change). For background information on this acquisition, see The Importance of Plant Level Systems, Multipurpose SAP NetWeaver, and Enterprise Resource Planning Giants Eye the Shop Floor.

Lighthammer CMS has been re-branded as SAP xApp Manufacturing Integration and Intelligence (SAP xMII). Built on a modern, service-oriented architecture (SOA)-based foundation, the former Lighthammer CMS provided a broad set of services required to relatively quickly assemble operational excellence applications in the areas of performance management, continuous improvement, and operational synchronization. The initial version of xMII is basically the former Lighthammer software, re-released in accordance with SAP software production methodology. Moving forward, the xMII team's charter will be to help SAP manufacturing customers achieve better business performance through the synchronization of operations with business functions and continuous improvement. This translates into packaged manufacturing integration and intelligence solutions targeted at real time performance measurement. On the integration front, xMII will maintain a considerable degree of autonomy, but will also be closely associated with SAP NetWeaver, running on the SAP NetWeaver Web Application Server (WAS). This autonomy is required to match the unique needs of manufacturing operations that are non-SAP shops or that have limited on-site information technology (IT) resources and skills, both of which can be obstacles to leveraging the complex NetWeaver stack.

The SAP xMII solution will provide near real time visibility into manufacturing exceptions and performance variances, including root causes and business impacts. This will enable manufacturers and their production personnel to better adapt to change and to respond more rapidly to unforeseen demand and supply events. In addition, this combination reportedly will permit SAP to deliver real time transactional integration between enterprise resource planning (ERP) and plant floor systems. Another potential benefit will be the ability to provide unified, real time analytics and visualization, often referred to as manufacturing intelligence or plant intelligence, out of the box to manufacturing customers. Moreover, with the xMII solution, SAP is also aiming to enable user companies to leverage their current investments at a lower total cost of ownership (TCO). For more information, see Plant Intelligence as Glue for Dispersed Data?.

Using the Instrumentation, Systems, and Automation (ISA)-95 standards for process manufacturing interoperability (an emerging standard for interfacing low level industrial control level [ICL] code to business applications, which aims to further reduce the complexity of building custom connections to shop floor systems and thereby accelerate the time-to-value for the end customer), the Lighthammer and SAP manufacturing solution will exchange data and render them through SAP manufacturing intelligence dashboards, in order to deliver actionable intelligence in the form of alerts, reports, key performance indicators (KPI), and decision support to production personnel for right-time decision making (see Manufacturer's Nirvana—Real Time Actionable Information and SAP NetWeaver Background, Direction, and User Recommendations). The combined solution will thus allow production personnel to identify deviations in real time, provide drill-downs so as to understand the business and financial impact of the exceptions to be managed, and display the workflows so as to resolve them relatively rapidly and cost-effectively. The aim, of course, is improved productivity.
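
For readers who want to picture the plumbing, here is a minimal sketch, in Python, of what a simplified plant-to-ERP production performance message might look like. The element names, identifiers, and quantities are illustrative approximations, not the actual ISA-95/B2MML schema that SAP XI or xMII would use.

```python
# A minimal sketch, assuming a simplified, B2MML-inspired structure; the
# element names and identifiers below are illustrative, not the real schema.
import xml.etree.ElementTree as ET

def build_performance_message(order_id: str, material: str,
                              qty_good: float, qty_scrap: float) -> bytes:
    """Assemble a plant-to-ERP production performance message."""
    root = ET.Element("ProductionPerformance")
    ET.SubElement(root, "ProductionRequestID").text = order_id
    segment = ET.SubElement(root, "SegmentResponse")
    for label, qty in (("GOOD", qty_good), ("SCRAP", qty_scrap)):
        produced = ET.SubElement(segment, "MaterialProducedActual")
        ET.SubElement(produced, "MaterialDefinitionID").text = material
        ET.SubElement(produced, "Description").text = label
        ET.SubElement(produced, "Quantity").text = str(qty)
    return ET.tostring(root, encoding="utf-8")

# Example: report 950 good cans and 12 scrapped cans against order 4711.
message = build_performance_message("4711", "PAINT-BLUE-5L", 950, 12)
print(message.decode("utf-8"))
```
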
One idea that has been gaining in popularity lately is the inclusion of a value-adding process layer that can fairly easily link to scattered data sources, retrieve specific data, perform process logic, and deliver a meaningful output. Companies are applying manufacturing (plant) intelligence systems, such as the one supplied by Lighthammer, to aggregate appropriate information from plant-focused data sources into a meaningful context for presentation and analysis. These systems are a combination of integration or middleware platforms and business intelligence (BI) applications, since portals can aggregate and process manufacturing data for specific user communities, and then can share scheduling information across collaborative value chains. On the other hand, manufacturing intelligence systems can collect specific data from plant-focused devices and systems, and then analyze and present the information in dashboards and other KPI tracking systems. For more information, see Plant Intelligence as Glue for Dispersed Data?.

Integral to Lighthammer is the concept of non-intrusive connectivity, allowing legacy data sources to be integrated into the overall enterprise decision support scheme with minimal effort and no disruption to operations. The product's connectivity is not limited to data sources, as it can deliver information to a broad range of Web devices, including all major browsers, handheld or palmtop devices, Web phones, and enterprise applications. The visualization functionality includes a variety of charting components, support for wireless devices, and a set of wizards for automatic generation of Web page content for users with little or no technical expertise. There is also an animation editor in the Lighthammer technology that enables users to animate objects. For instance, one might want to be able to see a vessel actually filling up and see the level changing.

A comprehensive reporting module allows content from multiple data sources to be aggregated and correlated in a single report, which can be either "live" or static, and displayed in a browser, printed, or disseminated via e-mail. For some time, the product also has provided an "enterprise" option for multi-site views of production and manufacturing operations. This option enables multiple Illuminator (a core component of the former Lighthammer CMS suite that features solid extract, transform, and load [ETL] capabilities) servers throughout the business to provide a single, unified view of enterprise information. This allows, for example, a corporate process engineer to assist plants with process problems, or a production executive to view real time manufacturing results at a number of sites from a single web browser.
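
As a rough illustration of the multi-site idea (not the actual Illuminator interfaces, which are not described here), the following Python sketch rolls hypothetical per-site KPIs up into a single enterprise view; the site names, fetch function, and KPI names are invented for the example.

```python
# A hypothetical multi-site rollup; the sites, KPIs, and canned numbers
# are placeholders, not data from any real Illuminator server.
from statistics import mean

SITES = ["exton", "antwerp", "singapore"]

def fetch_site_kpis(site: str) -> dict:
    # In practice this would query each site's plant server; canned values
    # keep the sketch self-contained.
    canned = {
        "exton":     {"oee": 0.81, "first_pass_yield": 0.97},
        "antwerp":   {"oee": 0.76, "first_pass_yield": 0.94},
        "singapore": {"oee": 0.88, "first_pass_yield": 0.99},
    }
    return canned[site]

def enterprise_view(sites=SITES) -> dict:
    """Combine per-site KPIs into one corporate-level view."""
    per_site = {site: fetch_site_kpis(site) for site in sites}
    rollup = {kpi: round(mean(v[kpi] for v in per_site.values()), 3)
              for kpi in ("oee", "first_pass_yield")}
    return {"sites": per_site, "enterprise": rollup}

print(enterprise_view())
```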

The Lighthammer technology provides connectivity in the three areas where users need it.

1. It connects to the main SAP modules.
2. It connects to the dashboard, so that users have KPIs coming out of both the SAP environment and the manufacturing systems.
3. It connects to a BI platform, which is useful as the data warehouse (i.e., SAP BW) environment is an important source of information. For example, a customer might want to capture information about reason codes for failure, so that when things are not made as they are supposed to be, all that information is captured in a data warehouse.
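
As an illustration of the third point, the sketch below shows how failure reason codes might be captured and landed in a warehouse table for later analysis. The table layout and reason codes are assumptions made for the example, and sqlite3 simply stands in for the actual warehouse (such as SAP BW).

```python
# Sketch only: sqlite3 stands in for the data warehouse, and the schema
# and reason codes are illustrative assumptions.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE production_exceptions (
                    recorded_at TEXT,
                    order_id TEXT,
                    reason_code TEXT,
                    quantity_affected REAL)""")

def record_exception(order_id: str, reason_code: str, quantity: float) -> None:
    """Persist a shop floor failure with its reason code for later BI analysis."""
    conn.execute("INSERT INTO production_exceptions VALUES (?, ?, ?, ?)",
                 (datetime.now(timezone.utc).isoformat(), order_id,
                  reason_code, quantity))
    conn.commit()

record_exception("4711", "VISCOSITY_OUT_OF_SPEC", 12)
for row in conn.execute("SELECT reason_code, SUM(quantity_affected) "
                        "FROM production_exceptions GROUP BY reason_code"):
    print(row)
```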

The problem is that, while information comes from production operations, goes to a data warehouse, and is viewed by the business, the very people who fed the information in typically do not see the data. In fact, because failures down on the shop floor ripple up into the business, sharing information out of the BI layer through the manufacturing intelligence dashboards can in some cases be as valuable as getting the information from the production level. For this reason, Lighthammer touts its ability to enable manufacturing in an adaptive environment by providing the business context for manufacturing data through event-based integration, in order to close this loop between the business and production levels.

At some SAP events, the two formerly independent partner vendors related a scenario-based example that was modeled around a paint process, which had both process industry characteristics (e.g., using reactors and vessels that handle liquids and fluids) and consumer packaged goods (CPG) industry characteristics (in that material is packaged and ultimately put in a warehouse or on a shelf).

The process that the SAP and Lighthammer teams have developed starts with material being added to a mixing and reaction process, whereby the product is extracted from the reaction, and then filtered, dried, and placed as an intermediate in cans. This particular process is also applicable to the pharmaceutical industry. The product is then packaged, palletized, labeled, and shipped to a distribution center, where quality tests are performed; the ISA-95 integration standard is employed to exchange schedule and performance data between the ERP and plant-level applications. To eliminate any latency or lack of synchronization, the production plan update and associated master data are automatically transmitted to the plant floor via SAP XI using the ISA-95 integration standard. The production plan synchronizes the plant systems, so that performance data, including status, costs, and quality information, are fed back into SAP in real time.

To be precise, the production schedule is sent from mySAP ERP to Lighthammer CMS (now SAP xMII), transmitted to the automated system, and then displayed on the manufacturing dashboard. After the batch is executed, Lighthammer aggregates production performance data and automatically updates the mySAP ERP inventory. Needless to say, the solution also tracks how things are developing throughout the batch, capturing not only the start and end points of a batch, but also the intermediate ones. Thus, based on the sensitive data it captures as the batches are being manufactured and on some Six Sigma control analysis, Lighthammer technology detects quality problems, generates alerts, and quarantines the batches in mySAP ERP.
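
To make the quality-control step more concrete, here is a minimal Python sketch of the kind of three-sigma control check the scenario implies; the readings, limits, and quarantine flag are illustrative assumptions, not Lighthammer's actual analysis logic.

```python
# A minimal control-chart sketch; the data and the 3-sigma rule are
# illustrative, not Lighthammer's actual Six Sigma implementation.
from statistics import mean, stdev

def check_batch(in_process_readings, historical_readings, sigma_limit=3.0):
    """Flag a batch for quarantine if any in-process reading falls outside
    the historical mean +/- sigma_limit standard deviations."""
    center = mean(historical_readings)
    spread = stdev(historical_readings)
    lower, upper = center - sigma_limit * spread, center + sigma_limit * spread
    violations = [r for r in in_process_readings if not lower <= r <= upper]
    return {"quarantine": bool(violations),
            "violations": violations,
            "limits": (round(lower, 2), round(upper, 2))}

# Example: viscosity readings captured during the batch vs. a historical baseline.
baseline = [101.2, 99.8, 100.5, 100.1, 99.6, 100.9, 100.3, 99.9]
current = [100.4, 100.7, 104.9]  # the last reading drifts out of control
print(check_batch(current, baseline))
```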

Quarantining a batch based on an anomaly in the process is the epitome of a closed-loop behavior. Production quality alerts appear in the dashboard, and the production supervisor can then drill down into the alert to perform a rapid root cause analysis. At this point, it is important to have not only the visibility to stop or change the process, but an understanding of why this problem has occurred so as to prevent it from reoccurring. The final stage would thus be the production supervisor initiating a corrective action to fix the problem, resolving the exception before it becomes a customer issue in an effort to have a continuous improvement environment.

Another often presented scenario leverages radio-frequency identification (RFID) technology. In this scenario, one might have paint cans containing a certain color or a certain blend that are moving more quickly than others. RFID-enabled business processes would indicate the pattern of these cans on the floor. In addition, the notification of material available for shipping would occur automatically and immediately. What one would like to be able to do is respond at the manufacturing level to this change on a "now what?" basis. For example, the sales department might want to rapidly capitalize on this opportunity. In this scenario, the production plan can be re-aligned in real time, based on the actual capability to deliver or the capability to promise (CTP), and the transient opportunity can be successfully realized because one has the ability to respond.

With the above scenario, we are talking once again about a closed-loop application, whereby Lighthammer receives the schedule and master data from SAP, and Lighthammer in turn uses SAP XI to deliver real time alerts and KPIs to the SAP dashboard. The dashboard itself is a composite application consisting of the XI views, the KPIs, and any accompanying alerts. There might be alerts coming out of the SAP environment and out of the external plant-level systems as well, in which case Lighthammer would be monitoring conditions, calculating KPIs, and applying further execution logic.

The possible value of this for customers could be multifold. First of all, it is a closed-loop system with real time synchronization—when a plant manager is looking at data from Lighthammer on his or her dashboard, it is live data. Moreover, users have control over how often the data is sent to the screen, which is done automatically in the background. The business implications of quality performance and delivery issues on the shop floor are thereby quantified and made visible, while proactive exception detection is supported to minimize the overall supply chain impact. In addition, production personnel are empowered with a productivity tool that enables them to access all the relevant documentation on one single dashboard or system, in order to manage by exception, leverage the dashboard as a decision support environment, perform tasks assisted by automated workflows, and initiate improvements and monitor their impact with the KPI dashboard.
In May 2005, almost immediately before the acquisition, Lighthammer unveiled CMS 11.0, a major upgrade of the flagship product featuring enhanced scalability, multisite metrics, security, and traceability for regulatory compliance across the composite platform for building manufacturing intelligence applications. The new release also added features that extended the development environment's existing SOA-based performance management, continuous improvement, and operational synchronization capabilities. Importantly, the new capabilities aimed at helping developers more easily build and deploy applications that can be accessed across the distributed manufacturing enterprise. At least 60 percent of the code in version 11, which had been under development for about a year, had reportedly been rewritten. For end users, this might mean about 15 percent more functionality and complete upward compatibility with existing applications.

Among the most significant enhancements in version 11.0 was the Security Manager service, which added unified user management and single sign-on capabilities for run-time applications. This means users are able to access any CMS-built application regardless of the platform on which it runs; CMS, which previously operated only on Microsoft Windows-based systems, can now also run on other operating environments, such as Linux, Sun Solaris, and HP-UX. The service also allows integration with a wide range of third-party authentication systems, including SAP, lightweight directory access protocol (LDAP) using Active Directory, security assertion markup language (SAML), Windows Domains, Kerberos from the Massachusetts Institute of Technology (MIT), and others. These features should allow customers to better manage user roles, memberships, and attributes, as well as to define authentication or authorization services either from existing enterprise user management directories or through the Lighthammer application. This service should thus provide the ability to implement a security strategy that fits virtually any existing enterprise architecture, and should extend single sign-on into the domain of plant applications, improving compliance.
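
The article does not describe how Security Manager performs this delegation internally, but the general pattern of handing a plant application's login off to an existing directory looks roughly like the following Python sketch, which uses the third-party ldap3 package; the host, DN template, and user naming are hypothetical.

```python
# Illustrative only: delegating authentication to an existing LDAP/Active
# Directory service. Host, DN template, and credentials are placeholders.
from ldap3 import Server, Connection

LDAP_HOST = "ldaps://directory.example.com"              # hypothetical server
USER_DN_TEMPLATE = "uid={username},ou=people,dc=example,dc=com"

def authenticate(username: str, password: str) -> bool:
    """Return True if the directory accepts a simple bind for this user."""
    server = Server(LDAP_HOST)
    conn = Connection(server,
                      user=USER_DN_TEMPLATE.format(username=username),
                      password=password)
    accepted = conn.bind()      # False if the credentials are rejected
    conn.unbind()
    return accepted

# Example (requires a reachable directory):
# if authenticate("jdoe", "s3cret"):
#     print("Access granted")
```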

Additional compliance and traceability features that were added include an electronic signature service and a multilevel confirmation or challenge capability, which securely controls and documents user actions for regulatory compliance with 21 Code of Federal Regulations Part 11 (21 CFR 11), Sarbanes-Oxley (SOX), and other regulations. The enterprise application integration (EAI) capabilities have also been enhanced, with the addition of new business logic capabilities that take advantage of Web services in SAP NetWeaver to simplify data integration between plant systems and enterprise systems. Last but not least, the generally available CMS version 11.0 laid the groundwork for another upgrade set. Currently, the product is built mostly on Java, but the logic engine is based on Microsoft .NET. The next release, however, will be 100 percent Java-based, which should give customers a much broader choice of development platform.

Lighthammer's process manufacturing industry expertise and its foresight in developing intelligent manufacturing middleware were helped by its early commitment to open technologies like the ISA-95 standard, Java, extensible markup language (XML), and SOA. Even earlier releases showcased Lighthammer's leadership in deploying these open technologies as enablers for the acquisition, analysis, distribution, and presentation of information from manufacturing systems. Lighthammer CMS functionality has long included built-in transformation of data into any standard XML message structure, such as Microsoft BizTalk, RosettaNet, and others, as well as the ability to interface with peer plant-level or enterprise-level systems using XML as the default data format for both incoming and outgoing data. Back in 2001, Illuminator 8.5 introduced a breakthrough Intelligent Agent subsystem, which could be used to enable inter-application messaging upon detection of production events or exceptions; automated calculation of KPI metrics; automatic transfer of information between XML, database, and e-mail sources; the gathering and conversion of data from external Web sources; and much more.
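
The "intelligent agent" idea lends itself to a compact sketch. The following Python example shows only the general pattern (poll a value, evaluate a rule, fire a notification); the tag reader, limit, and notification hook are hypothetical placeholders rather than anything from Illuminator itself.

```python
# General agent pattern only; the tag source, limit, and notification
# mechanism are placeholders, not Illuminator's Intelligent Agent API.
import random
import time

def read_tag(tag_name: str) -> float:
    # Placeholder for a real plant data source (historian, OPC server, etc.).
    return random.uniform(60.0, 110.0)

def notify(subject: str, body: str) -> None:
    # Placeholder for inter-application messaging: an XML message, an
    # e-mail via smtplib, or a call into an enterprise system.
    print(f"[ALERT] {subject}: {body}")

def run_agent(tag_name: str = "REACTOR1.TEMP", limit: float = 100.0,
              interval_s: float = 0.5, cycles: int = 5) -> None:
    """Poll a tag and raise an alert whenever it exceeds its limit."""
    for _ in range(cycles):
        value = read_tag(tag_name)
        if value > limit:
            notify("Production exception",
                   f"{tag_name} = {value:.1f} exceeds limit {limit}")
        time.sleep(interval_s)

run_agent()
```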

