Tuesday, October 20, 2009

Analysis of SAS Institute and IBM Intelligence Alliance

"At IBM's PartnerWorld 2000 in San Diego this Monday (24Jan00), SAS Institute and IBM will announce a new business intelligence relationship that will include the formation of consulting practices focused on SAS solutions, and further development of e-business intelligence solutions that integrate IBM's DB2 database product family and SAS software.

The announcement between the two business intelligence leaders is the latest in a select group of key strategic relationships forged by IBM as it refocuses its partnering efforts to provide world-class e-business applications. Recent announcements have included partnerships with other leading software providers such as Siebel Systems and SAP AG.

The agreement between IBM and SAS Institute and the planned joint development efforts will result in:

* Creation of a consulting practice in IBM Global Services specializing in SAS solutions. These consultants will work with joint customers to integrate the powerful decision support capabilities of SAS solutions with existing transaction systems and other e-business applications.

* Closer integration of SAS solutions and DB2 Universal Database to enhance performance for all IBM server platforms, including Netfinity, AS/400, RS/6000, NUMA-Q and S/390.

* IBM Global Services' access to a wide range of SAS Institute solutions for business intelligence, data warehousing, and decision support.

The relationship will initially focus on three primary areas where IBM and SAS Institute will offer end-to-end solutions to enterprise customers. IBM Global Services will provide the analytical services, systems integration and industry-specific consulting expertise. SAS Institute will provide software solutions for Customer Relationship Management (CRM), Enterprise Resource Planning (ERP), and Supplier Relationship Management (SRM). IBM and SAS Institute plan to more tightly integrate and thus enhance performance of DB2 Universal Database and SAS software."

The existing customer base for IBM DB2 Universal Database should be strongly interested in this development. The ability to access solutions for customer relationship management and extended supply chain management should be especially intriguing. We believe that the pairing of SAS's strong business intelligence solutions with IBM's global sales and consulting forces will be a powerful combination. The question for customers will be whether this is just a marketing alliance or an actual combination of powerful products at the code level, allowing customers to integrate the products seamlessly.

Using Predictive Analytics within Business Intelligence: A Primer

Predictive analytics has helped drive business intelligence (BI) towards business performance management (BPM). Traditionally, predictive analytics and models have been used to identify patterns in consumer-oriented businesses, such as identifying potential credit risk when issuing credit cards, or analyzing the buying habits of retail consumers. The BI industry has shifted from identifying and comparing data patterns over time (based on batch processing of monthly or weekly data) to providing performance management solutions with right-time data loads in order to allow accurate decision making in real time. Thus, the emergence of predictive analytics within BI has become an extension of general performance management functionality. For organizations to compete in the marketplace, taking a forward-looking approach is essential. BI can provide the framework for organizations focused on driving their business based on predictive models and other aspects of performance management.

We'll define predictive analytics and identify its different applications inside and outside BI. We'll also look at the components of predictive analytics and its evolution from data mining, and at how they interrelate. Finally, we'll examine the use of predictive analytics and how they can be leveraged to drive performance management.

Overview of Analytics and Their General Business Application

Analytical tools enable greater transparency within an organization, and can identify and analyze past and present trends, as well as discover the hidden nature of data. However, past and present trend analysis and identification alone are not enough to gain competitive advantage. Organizations need to identify future patterns, trends, and customer behavior to better understand and anticipate their markets.

Traditional analytical tools claim to have a 360-degree view of the organization, but they actually only analyze historical data, which may be stale, incomplete, or corrupted. Traditional analytics can help gain insight based on past decision making, which can be beneficial; however, predictive analytics allows organizations to take a forward-looking approach to the same types of analytical capabilities.

Credit card providers offer a first-rate example of the application of analytics (specifically, predictive analytics) in their identification of credit card risk, customer retention, and loyalty programs. Credit card companies attempt to retain their existing customers through loyalty programs, and need to take into account the factors that cause customers to choose other credit card providers. The challenge is predicting customer loss. In this case, a model using three predictors can help predict customer loyalty: frequency of use, personal financial situation, and lower annual percentage rates (APRs) offered by competitors. Combined, these predictors form a predictive model which, when applied, places customers into categories based on the resulting data. Any change in a customer's classification flags that customer, who is then targeted for the loyalty program. Financial institutions, on the other hand, use predictive analytics to identify the lifetime value of their customers. Whether this translates into increased benefits, lower interest rates, or other perks for the customer, classifying customers and applying patterns to different segments allows the financial institutions to best benefit from (and provide benefit to) their customers.
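
To make this concrete, below is a minimal, hypothetical sketch of how the three predictors above might be combined into a churn-risk score that flags customers for the loyalty program. The thresholds, weights, and function names are our own illustrative assumptions, not any card provider's actual model.

```python
# Hypothetical churn-risk scoring: combines the three predictors discussed
# above (frequency of use, personal financial situation, competitor APR gap).
# All weights and thresholds are illustrative assumptions, not a real model.

def churn_risk(uses_per_month: float, debt_to_income: float,
               competitor_apr_gap: float) -> float:
    """Return a 0..1 risk score; higher means more likely to defect."""
    score = 0.0
    if uses_per_month < 2:          # card is rarely used
        score += 0.4
    if debt_to_income > 0.5:        # financial stress often precedes churn
        score += 0.3
    if competitor_apr_gap > 0.02:   # a rival offers a rate >2 points lower
        score += 0.3
    return min(score, 1.0)

def classify(score: float) -> str:
    """Bucket customers so that a change in bucket can trigger a flag."""
    if score >= 0.6:
        return "high risk"
    if score >= 0.3:
        return "medium risk"
    return "low risk"

# A customer whose classification changes (e.g., low -> high) would be
# flagged and targeted for the loyalty program.
print(classify(churn_risk(uses_per_month=1, debt_to_income=0.6,
                          competitor_apr_gap=0.03)))   # -> "high risk"
```
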
Data mining can be defined as an analytical tool set that searches for data patterns automatically and identifies specific patterns within large datasets across disparate organizational systems. Data mining, text mining, and Web mining are types of pattern identification. Organizations can use these forms of pattern recognition to identify customers' buying patterns or the relationship between a person's financial records and their credit risk. Predictive analytics moves one step further and applies these patterns to make forward-looking predictions. Instead of just identifying a potential credit risk, an organization can identify the lifetime value of a customer by developing predictive decision models and applying these models to the identified patterns. These types of pattern identification and forward-looking model structures can equally be applied to BI and performance management solutions within an organization.

Predictive analytics is used to determine the probable future outcome of an event, or the likelihood of a situation occurring. It is the branch of data mining concerned with the prediction of future probabilities and trends. Predictive analytics automatically analyzes large amounts of data using a range of techniques, including clustering, decision trees, market basket analysis, regression modeling, neural nets, genetic algorithms, text mining, hypothesis testing, decision analytics, and so on.

The core element of predictive analytics is the predictor, a variable that can be measured for an individual or entity to predict future behavior. Predictors are combined within the models built for analysis. Descriptive models classify relationships by identifying customers or prospective customers, and placing them in groups based on identified criteria. Decision models consider business and economic drivers and constraints that go beyond the scope of a general predictive model. Statistical analysis helps drive this process as well: the predictors are the factors that determine the outcomes of the model. For example, a financial institution may want to identify the factors that make a valuable lifetime customer.

Multiple predictors can be combined into a predictive model, which, when subjected to analysis, can be used to forecast future probabilities with an acceptable level of reliability. In predictive modeling, data is collected, a statistical model is formulated, predictions are made, and the model is validated (or revised) as additional data becomes available. One of the main differences between data mining and predictive analytics is that data mining can be a fully automated process, whereas predictive analytics requires an analyst to identify the predictors and apply them to the defined models.
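
As a sketch of that collect-model-predict-validate cycle, the following uses scikit-learn (our choice of library; the article names no specific tools) on synthetic data to fit a model, hold out a validation set, and measure reliability before the model is trusted:

```python
# Sketch of the predictive-modeling cycle described above: collect data,
# formulate a statistical model, predict, then validate on held-out data.
# The data here is synthetic; in practice it would come from the warehouse.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))   # three predictors per customer
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

# Hold out data so the model can be validated (or revised) later.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Validation step: if accuracy is unacceptable, reformulate and refit.
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```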

A decision tree is a predictive analytics technique that lets the user visualize the mapping from observations about an item to conclusions about the item's target value. Basically, decision trees are built as a hierarchy of predictor attributes: each node tests another factor, and the leaves represent the resulting conclusions. This can be compared to nested if-else statements, which identify a result based on whether certain factors meet specified criteria. For example, in order to assess potential bad debt based on credit history, salary, demographics, and so on, a financial institution may wish to identify multiple scenarios, each of which is likely to meet bad debt customer criteria, and use combinations of those scenarios to identify which customers are most likely to become bad debt accounts.
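
A minimal sketch of such a tree, fit with scikit-learn on invented bad-debt data (the feature names, values, and depth limit are illustrative assumptions), shows the learned hierarchy printed as nested if-else rules:

```python
# Illustrative decision tree for the bad-debt example: the tree learns a
# hierarchy of if-else tests over predictor attributes (credit history
# score, salary) whose leaves classify customers as good or bad debt risks.
from sklearn.tree import DecisionTreeClassifier, export_text

# [credit_history_score, salary_in_thousands] -- synthetic training rows
X = [[300, 20], [450, 35], [600, 50], [700, 80], [750, 95], [820, 120]]
y = [1, 1, 1, 0, 0, 0]   # 1 = became a bad-debt account, 0 = did not

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)

# export_text prints the learned hierarchy as nested if-else rules.
print(export_text(tree, feature_names=["credit_score", "salary_k"]))
print(tree.predict([[500, 30]]))   # classify a new applicant -> [1]
```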

Regression analysis is another component of predictive analytics; it models the relationships between two or more variables in order to predict the value of one variable from the values of the others. It can be used to identify buying patterns based on multiple demographic qualifiers, such as age and gender, which can help identify where to sell specific products. Within BI, this is beneficial when used with scorecards that focus on geography and sales.
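
Here is a toy sketch of such a regression, assuming invented demographic data, using ordinary least squares in NumPy to predict spending from age and gender:

```python
# Sketch of regression analysis: model monthly purchases as a function of
# age and gender (coded 0/1), then predict for a new demographic segment.
# Data and coefficients are synthetic, purely for illustration.
import numpy as np

age    = np.array([22, 35, 47, 29, 56, 41], dtype=float)
gender = np.array([0, 1, 0, 1, 1, 0], dtype=float)         # 0 = male, 1 = female
spend  = np.array([120, 260, 310, 200, 400, 280], dtype=float)

# Design matrix with an intercept column; solve least squares for the betas.
X = np.column_stack([np.ones_like(age), age, gender])
beta, *_ = np.linalg.lstsq(X, spend, rcond=None)

# Predict spending for a 30-year-old female shopper.
print(float(np.array([1.0, 30.0, 1.0]) @ beta))
```
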
Practical applications of all of these analytical models allow organizations to forecast financial outcomes, hopefully increasing revenues in the process. Within BI, aside from financial outcomes, predictive analytics can be used to develop corporate strategies throughout the organization. What-if analyses can be performed to leverage the capabilities of predictive analytics to build various scenarios, allowing organizations to map out a series of outcomes of strategic and tactical plans. This way, organizations can implement the best strategy based on the scenarios created.

How Predictive Analytics Are Used within BI, and How They Drive an Organization's BPM

Data mining, predictive analytics, and statistical engines are examples of tools that have been embedded in BI software packages to leverage the benefits of performance management. If BI is backward looking, and data mining identifies the here and now, then predictive analytics, as used within performance management, is the looking glass into the future. This forward-looking view helps organizations drive their decision making. BI is known for its consolidation of data from disparate business units, and for its analysis capabilities based on that consolidated data. Performance management goes one step further by leveraging the BI framework (such as the data warehousing structure and extract, transform, and load [ETL] capabilities) to monitor performance, identify trends, and give decision makers the ability to set appropriate metrics and monitor results on an ongoing basis.

With predictive analytics embedded within the above processes, the metrics set and business rules identified by organizations can be used to identify the predictors that need to be evaluated. These predictors can then be used to shift decision making towards a forward-looking approach, drawing on the strengths of the areas identified above. Scorecards are one example of a performance management tool that can leverage predictive analytics. The identification of sales performance by region, product type, and demographics can be used to define what new products should be introduced into the market, and where. In general, scorecards can graphically reflect the selected sales information and build what-if scenarios on that data to verify the right combination of new products and distribution.

What-if scenarios can be used within the different visualization tools to create business models that anticipate what might happen within an organization based on changes in defined variables. What-if analysis gives organizations the tools to identify how profits will be affected by changes in inflation and pricing patterns, as well as the impact of increasing the number of employees throughout the organization. Online analytical processing (OLAP) cubes can be created to organize dimensional data; patterns within changing dimensions can then be compared over time, with the cube structure used to automatically view and contrast the outcomes of the what-if scenarios.
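
The sketch below is a toy what-if model with invented base figures and a hypothetical price-elasticity assumption, showing how varying inflation, price, and headcount flows through to projected profit:

```python
# Toy what-if scenario model: vary assumptions (inflation, price change,
# headcount) and observe the effect on projected profit. All base figures
# are invented; a real model would read from the warehouse or OLAP cubes.

def projected_profit(inflation: float, price_change: float, extra_staff: int,
                     base_revenue: float = 1_000_000.0,
                     base_costs: float = 800_000.0,
                     cost_per_employee: float = 60_000.0,
                     price_elasticity: float = -1.2) -> float:
    volume_change = price_elasticity * price_change      # demand response
    revenue = base_revenue * (1 + price_change) * (1 + volume_change)
    costs = base_costs * (1 + inflation) + extra_staff * cost_per_employee
    return revenue - costs

scenarios = {
    "baseline":                 (0.02, 0.00, 0),
    "raise prices 5%":          (0.02, 0.05, 0),
    "hire 3, inflation spikes": (0.06, 0.00, 3),
}
for name, (infl, price, staff) in scenarios.items():
    print(f"{name:>26}: {projected_profit(infl, price, staff):>12,.0f}")
```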

Marketing and Intelligence, Together at Last

Angara offers an ASP-based service for targeting web site content to unidentified visitors (see article, "Getting Strangers to Take Your Candy"). The company buys online profile data from other websites. These are data that users agree to provide in exchange for receiving newsletters or other offers, or that are captured from clickstreams by online advertising networks such as MatchLogic.

By arrangement with the websites, Angara gets to drop a cookie - but not any data that might identify the user as an individual. When the user later visits an Angara customer, Angara can provide segmentation information such as age, sex, or geographic region. The customer's website uses the segmentation information to serve targeted content to the visitor.
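
As a purely hypothetical sketch of the serving side (the segment keys and content choices are our invention; the article does not describe Angara's actual interface), a site might map broad segment attributes to landing content like this:

```python
# Hypothetical sketch of segmentation-driven targeting: the site receives
# only broad segment attributes (never personally identifying data) and
# picks content for the first-time visitor. Segment keys are invented.

def pick_landing_content(segment: dict) -> str:
    region = segment.get("region")
    age_band = segment.get("age_band")
    if region == "northeast" and age_band == "18-34":
        return "urban_promo.html"
    if segment.get("sex") == "f":
        return "womens_collection.html"
    return "default_home.html"

# Example: attributes a service like Angara might return for a cookie.
visitor_segment = {"region": "northeast", "age_band": "18-34", "sex": "f"}
print(pick_landing_content(visitor_segment))   # -> urban_promo.html
```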

In the case of data from ad agencies, Angara is given access to the cookies dropped by the agencies. In both cases the data only identify broad characteristics of the user, such as sex, interests and responses to categories of advertising. The goal is to make first time visitors more likely to make purchases or return to the site.

Net Perceptions specializes in analysis of existing customers. The company sifts through data on the viewing and purchasing behavior of shoppers and uses its conclusions to make recommendations for personalized offers and targeted ads. Net Perceptions has chosen Angara to complement its own offerings in its new ASP offering, called the Net Perceptions Personalization Network.

The Personalization Network will offer four "channels," each driven from databases compiled by the network:

* The Intelligence Channel provides analytic tools to let companies understand their website visitors.

* The Recommendation Channel makes recommendations for cross-sells and up-sells based on the behavior of previous visitors.

* The Customer Acquisition Channel uses Angara's Converter product to target content to first-time visitors.

* The E-Mail Channel, a strategic partnership with Xchange, Inc., provides clients with the ability to design and target consumer emails.

The move by eCRM firms to embrace ASP offerings is accelerating. In Angara's case an ASP model is a necessity because of the dependence of their solution upon the data they collect. Net Perceptions is one of the very first to move an existing, successful suite to an ASP model. We expect that this will encourage the expansion of the market.

One thing that could hurt this market in the future is a privacy scare. Angara has a good privacy model in that they never get to see information that identifies individuals. One might argue that opting in to receive promotions does not necessarily mean that you want to be identified to a service that tells websites how to serve content to you, but given that the Web is supported by advertising this seems to us like a minimal intrusion, if it is one at all.

Net Perceptions takes no responsibility for the use its customers make of their data; its official policy is "Net Perceptions encourages all of its customers to adopt privacy standards of their own and make those standards freely accessible." We haven't yet seen a privacy policy for the ASP service. We believe that it should contain provisions that each ASP customer's data will be kept isolated from all the other customers, and that data collected through Net Perceptions' applications not become part of the Angara database.

We don't see data merging of this type to be a priori improper - that would depend on the mechanics - but we feel certain that it would ignite the concerns of privacy advocates and the public. Angara assures us that there are in fact no plans for any such data merging.

Enterprise Resource Planning Vendor Gains Connectivity through Acquisition of Plant Intelligence Provider

The acquisition of Lighthammer Software Development Corporation (www.lighthammer.com), a privately-held supplier of enterprise manufacturing intelligence and collaborative manufacturing software, by SAP might indicate that manufacturing operations management (MOM) software systems are becoming ripe for consolidation. MOM software is the Web-based collaborative software layer (with the traits of both integration and analytic applications) for monitoring plant-level trends, establishing operational context, and disseminating plant-level information to other enterprise-wide constituencies. It is also referred to as enterprise manufacturing intelligence (EMI), manufacturing performance service (MPS), or whichever other acronym some analyst has come up with to make the traditionally not very user-friendly space that includes manufacturing execution systems (MES), plant automation systems, and other plant-centric technologies seem more attractive.

For background information on this acquisition, see The Importance of Plant Level Systems, Multipurpose SAP NetWeaver, and Enterprise Resource Planning Giants Eye the Shop Floor.

In fact, there have been numerous examples of other large plant-centric vendors (including the likes of ABB, Rockwell Automation, General Electric [GE], and Siemens) acquiring an array of companies and products (such as the former Skyva, Systems Modeling, IndX, and Datasweep), thus enabling them to build a broader, integrated, single-source MES scope. SAP's acquisition of Lighthammer might suggest that such manufacturing floor ventures of enterprise applications vendors are more than merely the knee-jerk reaction of a long overdue and much anticipated spending increase in the plant-level software market (see Do Chinese Enterprises Really Need MES and WMS? and The Challenges of Integrating Enterprise Resource Planning and Manufacturing Execution Systems).

Plant floor applications are generally very different from each other, even though their vendors deliver somewhat generic solutions, since continuous flows, discrete piece production rates, temperatures, pressures, and other manufacturing process parameters are common across many manufacturing applications. Still, owing to a dearth of standardized plant-level processes, bundled with a raft of manufacturing styles and industry-specific regulatory compliance (and consequently quality assurance) requirements, user organizations have typically implemented applications on a system-by-system basis. This is in part a response to firefighting requirements defined by department managers, manufacturing engineers, and equipment or process vendors.

This diversity of applications affects one of the major roles of the plant execution system, which is to collect and pool data from the real time processes for delivery to planning level enterprise applications, including enterprise resource planning (ERP) and supply chain management (SCM) systems. This is because, while mainstream ERP vendors have invested in making their products more attractive to specific vertical markets, they cannot really afford to deliver specialized functionality unless there is a large market.

As with its earlier appetizing acquisitions, such as those of TopTier, TopManage, and A2i (see SAP Acquires TopTier to Further Broaden Its Horizons and SAP Bolsters NetWeaver's MDM Capabilities; Part Four: SAP and A2i), the Lighthammer deal should provide SAP with several benefits. For one, the two parties have quite close product DNAs, since Lighthammer has long been a strategic marketing and development partner, with a manufacturing-centric product, which is now delivered as an SAP xApp-like composite application.

Lighthammer, formerly an SAP partner, had worked to create technology integration between its products and the SAP architecture, so reconfiguring Lighthammer as an SAP composite application running on SAP NetWeaver should present no special difficulty. With the acquisition of Lighthammer, SAP gains workflow-based connectivity to virtually any source on the plant floor and analytical functionality with Lighthammer's products for plant intelligence. This meshes well with SAP's recent business intelligence (BI) dashboard forays (see Business Intelligence Corporate Performance Management Market Landscape).

Furthermore, a high percentage (over 85 percent) of Lighthammer's approximately 150 clients are also SAP clients, a fact which should help SAP manage these clients' expectations. In addition, the improved plant-level functionality should make SAP more competitive in non-SAP environments as well. In particular, SAP's existing non-Lighthammer manufacturing clients should benefit, because they should gain greater flexibility in integrating multiple plant floor solutions with SAP. On the flip side of the coin, the vendor has pledged to support existing Lighthammer-only customers for a period of time. However, logically, the value of operating in this mode would decrease if customers are not going to pursue an SAP-centric strategy in the long term.

We concur with AMR Research's finding in its November 2005 report, SAP Plus Lighthammer Equals xMII, that there are ample opportunities for vendors to amplify xApp Manufacturing Integration and Intelligence (xMII) in terms of data historians or operational data stores, industry-based manufacturing models and KPIs, data mining add-ons to enable proactive, model-based decision making, etc. xMII performance management product functionality is moving in the direction of enhanced alert and event management, knowledge management, real time economics, and directed workflows, which will be key to encapsulating the information and capabilities needed to make better and faster decisions at multiple levels within manufacturing. Over time, xMII will leverage selected SAP technologies such as the new SAP Visual Composer, which reduces the effort required to develop user interfaces (UI), but will present a challenge to users who have to adapt to the change.

Another weak area that SAP acknowledges is its inability to structurally improve manufacturing processes themselves. This is because it is incredibly difficult to map what is happening on the shop floor in detail to, for example, the business systems or the costing systems. It is even more difficult across multiple plants, as the vendor has to provide customers with the ability not only to get the workflows right, but also to assemble the data needed to make structural improvements. For this, one would need a plant-level analytic server that could unify data from multiple process control systems into a single contextual database in order to capture, process, and transform real time raw data into intelligible monitoring, counting, and measuring information that could be used by planning and other systems.

The Lighthammer acquisition may compound the above problem. So far, Lighthammer's raison d'être has been mostly to provide visibility into disparate plant systems for root cause analysis, or, to put it another way, merely to take raw data and distill it on the screen. Unfortunately, SAP has never owned the complex data models or analytical tools for process discovery. Somewhat ironically, this gap might provide many opportunities to other vendors that sell plant-focused applications at many levels of solutions for manufacturing and value networks, who might even use Lighthammer as the integration toolkit and interface to SAP. These vendors may include Invensys/Wonderware, Rockwell/Datasweep, Camstar, Accumence (formerly Visibility Systems), Visiprise, PEC Info, DSM, Activplant, Informance, OSIsoft, Pavilion Technologies, CIMNET, GE Fanuc, Citect, Siemens, Yokogawa, and PSI-BT, to name only a few.

A further challenge for SAP will be establishing itself as a trustworthy partner to independent software vendors (ISVs) that fear being acquired, in order to build the necessary ISV ecosystem (see SAP NetWeaver Background, Direction, and User Recommendations). SAP also has to clarify for potential plant-level ISV partners how to use xMII as an underlying platform for delivering preconfigured industry templates and systems.

Another uncharted area is the proliferation of Lighthammer into the discrete manufacturing industries, since, despite the over one hundred joint customers, the focus of this relationship has been predominantly on process manufacturing (e.g., chemicals or life sciences) environments. It makes sense for SAP to have started with the process industries because there was more apparent opportunity there. Nonetheless, although the acquisition restricts Lighthammer competitors from further penetrating SAP process manufacturing accounts, the next challenge is for both merging parties to respond to the unique needs of discrete manufacturers for standards-based interoperability and plant-level requirements within the automotive, aerospace, high technology, and other discrete manufacturing industries. In the end, the vendor hopes to achieve the maximum commonality between the two sectors, but that is going to be neither quick nor easy.

Even in light of the acquisition of Lighthammer, and given the natural question of what the acquisition means for other plant-level SAP software partners, SAP maintains that it will remain fully committed to strongly supporting and growing these partner relationships, and that it does not expect that this acquisition will interfere with that. Indeed, SAP may have an industry-wide ethical responsibility to stick to this agreement. Exemplifying this, a group of leading manufacturing companies and software vendors endorsed the Instrumentation, Systems, and Automation Society's (ISA) ISA-95 Enterprise-to-Control System Integration standards and World Batch Forum's (WBF) business to manufacturing markup language (B2MML) at a recent plant-to-business (P2B) interoperability workshop hosted by SAP and ARC Research. Workshop attendees also discussed the establishment of an open vendor and user consortium to share knowledge and best practices for plant floor to business integration and to provide compliance certification for use of B2MML and related standards. In addition to SAP and ARC, participants included representatives from Apriso, Arla Foods, Datasweep, Dow Corning Corporation, DuPont Engineering, Eli Lilly, Emerson Process Management, Empresas Polar S.A., GE Fanuc, General Mills, Invensys-Wonderware, Lighthammer, MPDV, MPR de Venezuela, OSIsoft, Procter and Gamble, PSI Soft, Rockwell Automation, Rohm and Haas, SAB Miller, Siemens, and Yokogawa, as well as representatives from ISA and WBF. Ever since this endorsement, the progress in terms of leveraging ISA-95 as a standard and the WBF's B2MML as an appropriate schema for the process industries has been remarkable.

Similarly, about two years ago, in response to growing customer need, SAP announced the industry-wide "manufacturing interoperability" initiative, the aim of which was to dramatically reduce enterprise-to-production systems integration costs using available industry standards. For more details see Multipurpose SAP NetWeaver.

SAP is not to blame for having planning solutions that, in some cases, extend deep into the plant floor; the lack of integration stems in part from the software industry's tacit agreement to divide the software world into disjointed sets of vendors—the automation vendors, the MES vendors, and the ERP or enterprise applications vendors. Viewing things through this old and outmoded mindset has brought many endeavors to a halt over the question of where the line between MES and ERP lies. As discussed in The Challenges of Integrating Enterprise Resource Planning and Manufacturing Execution Systems, the answer is often that there is no one clear line.

For example, due to the notion of "plant-centric ERP" in the 1990s, vendors, such as the former Marcam in process manufacturing and Baan in discrete manufacturing, had deep manufacturing models that challenged the artificial boundaries between ERP and MES. Along the same lines, Oracle plans to add more built-in plant-level functionality in the upcoming Oracle e-Business Suite Release 12, precluding the need for the typically extensive and painful customization outside of Oracle's toolset. Even smaller ERP vendors have been adding industry-specific functionality, for example Ross Systems (now part of CDC Software) for pharmaceutical and life sciences companies, IQMS for plastic processors, and Plexus Systems for automotive customers.

When it comes to SAP, it has a lot of customers that use SAP functionality to tie directly into low-level shop floor systems. However, there are also SAP customers that at the same time—at another site or division—use SAP in conjunction with an MES system. In the end, SAP will likely compete in the marketplace where it feels its functionality is competitive enough compared to other solutions. But the vendor acknowledges that customers have, for good reasons, installed other solutions, and will continue to do so. Thus, in order to be a trusted platform provider, it will want to be able to integrate with those systems.

Has SAP Nailed Plant Level Leadership with Lighthammer?

At the end of June, SAP announced that it was delivering enhanced connectivity between the plant floor and the enterprise by acquiring Lighthammer Software Development Corporation (www.lighthammer.com), a privately-held supplier of enterprise manufacturing intelligence and collaborative manufacturing software, based in Exton, Pennsylvania (US). Lighthammer and SAP shared a vision of adaptive business networks (ABN), as illustrated by their longstanding partnership, during which Lighthammer was a premier "SAP Powered by NetWeaver" and SAP xApps partner. The company's approximately sixty employees have reportedly remained in their current facilities, and have become a part of SAP America and SAP Labs. Mufson Howe Hunter & Company LLC, a Philadelphia, Pennsylvania (US)-based investment bank, served as financial advisor to Lighthammer on this transaction.

At the time of the announcement, the two merging parties and formerly close partners believed that the acquisition would deliver value through improved manufacturing performance with more rapid time-to-value for SAP's installed base of more than 12,000 manufacturing customers. Lighthammer's Collaborative Manufacturing Suite (CMS), currently used by hundreds of companies worldwide, including more than 100 Fortune 500 manufacturing companies, was to be delivered as an SAP xApps composite application on the SAP NetWeaver platform, so as to provide enterprises with what SAP refers to as adaptive manufacturing (i.e., the ability of a manufacturer to profitably replenish the supply chain while dynamically adapting to unpredictable change). For background information on this acquisition, see The Importance of Plant Level Systems, Multipurpose SAP NetWeaver, and Enterprise Resource Planning Giants Eye the Shop Floor.

Lighthammer CMS has been re-branded as SAP xApp Manufacturing Integration and Intelligence (SAP xMII). Built on a modern, service oriented architecture (SOA)-based foundation, the former Lighthammer CMS provided a broad set of services that were required to relatively quickly assemble operational excellence applications in the areas of performance management, continuous improvement, and operational synchronization. The initial version of xMII is basically the former Lighthammer software, re-released in accordance with SAP software production methodology. Moving forward, the xMII team charter will be to help SAP manufacturing customers achieve better business performance through the synchronization of operations with business functions and continuous improvement. This translates into packaged manufacturing integration and intelligence solutions targeted for real time performance measurement. On the integration front, xMII will maintain a considerable degree of autonomy, but will also be closely associated with SAP NetWeaver, running on the SAP NetWeaver Web Application Server (WAS). This is because autonomy is required to match the unique product needs of manufacturing operations that are non-SAP shops or are driven by limited on-site information technology (IT) resources and skills, both of which can be an obstacle to leveraging the complex NetWeaver stack.

The SAP xMII solution will provide near real time visibility to manufacturing exceptions and performance variances, including root causes and business impacts. This will enable manufacturers and their production personnel to better adapt to change and to respond more rapidly to unforeseen demand and supply events. In addition, this combination reportedly will permit SAP to deliver real time transactional integration between enterprise resource planning (ERP) and plant floor systems. Another potential benefit will be the ability to provide unified, real time analytics and visualization, often referred to as manufacturing intelligence or plant intelligence, out of the box to manufacturing customers. Moreover, with the xMII solution, SAP is also aiming to enable user companies to leverage their current investments at a lower total cost of ownership (TCO). For more information, see Plant Intelligence as Glue for Dispersed Data?.

The Lighthammer and SAP manufacturing solution will use the Instrumentation, Systems, and Automation (ISA)-95 standards for process manufacturing interoperability, an emerging standard for interfacing low level industrial control level (ICL) code to business applications that aims to further reduce the complexity of building custom connections to shop floor systems and thereby accelerate time-to-value for the end customer. The solution will exchange data and render it through SAP manufacturing intelligence dashboards, in order to deliver actionable intelligence in the form of alerts, reports, key performance indicators (KPIs), and decision support to production personnel for right-time decision making (see Manufacturer's Nirvana—Real Time Actionable Information and SAP NetWeaver Background, Direction, and User Recommendations). The combined solution will thus allow production personnel to identify deviations in real time, provide drill-downs for understanding the business and financial impact of the exceptions to be managed, and display the workflows for resolving them relatively rapidly and cost-effectively. The aim, of course, is improved productivity.

One idea that has been gaining in popularity lately is the inclusion of a value-adding process layer that can fairly easily link to scattered data sources, retrieve specific data, perform process logic, and deliver a meaningful output. Companies are applying manufacturing (plant) intelligence systems, such as the one supplied by Lighthammer, to aggregate appropriate information from plant-focused data sources into a meaningful context for presentation and analysis. These systems are a combination of integration or middleware platforms and business intelligence (BI) applications, since portals can aggregate and process manufacturing data for specific user communities, and then can share scheduling information across collaborative value chains. On the other hand, manufacturing intelligence systems can collect specific data from plant-focused devices and systems, and then analyze and present the information in dashboards and other KPI tracking systems. For more information, see Plant Intelligence as Glue for Dispersed Data?.

Integral to Lighthammer is the concept of non-intrusive connectivity, allowing legacy data sources to be integrated into the overall enterprise decision support scheme with minimal effort and no disruption to operations. The product's connectivity is not limited to data sources, as it can deliver information to a broad range of Web devices, including all major browsers, handheld or palmtop devices, Web phones, and enterprise applications. The visualization functionality includes a variety of charting components, support for wireless devices, and a set of wizards for automatic generation of Web page content for users with little or no technical expertise. There is also an animation editor in the Lighthammer technology that enables users to animate objects. For instance, one might want to be able to see a vessel actually filling up and see the level changing.

A comprehensive reporting module allows content from multiple data sources to be aggregated and correlated in a single report, which can be either "live" or static, and displayed in a browser, printed, or disseminated via e-mail. For some time, the product also has provided an "enterprise" option for multi-site views of production and manufacturing operations. This option enables multiple Illuminator (a core component of the former Lighthammer CMS suite that features solid extract, transform, and load [ETL] capabilities) servers throughout the business to provide a single, unified view of enterprise information. This allows, for example, a corporate process engineer to assist plants with process problems, or a production executive to view real time manufacturing results at a number of sites from a single web browser.

The Lighthammer technology connects to the three areas to which users need connections:

1. It connects to the main SAP modules.
2. It connects to the dashboard, so that users have KPIs coming out of both the SAP environment and the manufacturing systems.
3. It connects to a BI platform, which is useful as the data warehouse (i.e., SAP BW) environment is an important source of information. For example, a customer might want to capture information about reason codes for failure, so that when things are not made as they are supposed to be, all that information is captured in a data warehouse.

The problem is that, while information comes from production operations, goes to a data warehouse, and is viewed by the business, the very people who fed the information in typically do not see the data. In fact, because failures on the shop floor ripple up into the business, sharing information from the BI layer back through the manufacturing intelligence dashboards can in some cases be as valuable as getting the information from the production level. For this reason, Lighthammer touts its ability to enable manufacturing in an adaptive environment by providing the business context for manufacturing data on an event-based integration in order to close this loop between the business and production levels.

At some SAP events, the two formerly independent partner vendors related a scenario-based example that was modeled around a paint process, which had both process industry characteristics (e.g., using reactors and vessels that handle liquids and fluids) and consumer packaged goods (CPG) industry characteristics (in that material is packaged and ultimately put in a warehouse or on a shelf).

The process that the SAP and Lighthammer teams have developed starts with material being added to a mixing and reaction process, whereby the product is extracted from the reaction, and then filtered, dried, and placed as an intermediate in cans. This particular process is also applicable to the pharmaceutical industry. The product is then packaged, palletized, labeled, and shipped to a distribution center, where quality tests are performed and the ISA-95 integration standard is employed to exchange schedule and performance data between the ERP and plant-level applications. To eliminate any latency or lack of synchronization, the production plan update and associated master data are automatically transmitted to the plant floor via SAP XI using the ISA-95 integration standard. The production plan synchronizes the plant systems, so that performance data, including status costs and quality information, are fed back into SAP in real time.

To be precise, the production schedule is sent from mySAP ERP to Lighthammer CMS (now SAP xMII), transmitted to the automated system, and then displayed on the manufacturing dashboard. After the batch is executed, Lighthammer aggregates production performance data and automatically updates the mySAP ERP inventory. Needless to say, the solution also tracks how things are developing throughout the batch, capturing not only the start and end points of a batch, but also the intermediate ones. Thus, based on the sensor data it captures as the batches are being manufactured and on some Six Sigma control analysis, Lighthammer technology detects quality problems, generates alerts, and quarantines the batches in mySAP ERP.
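
To illustrate the kind of exchange involved, here is a simplified, B2MML-inspired production performance message built with Python's standard library; the element names only loosely follow the ISA-95/B2MML schemas, and every value is invented:

```python
# Simplified, B2MML-inspired production performance message, of the kind a
# plant layer might send back to ERP after a batch completes. Element names
# only loosely follow the ISA-95/B2MML schemas; all values are invented.
import xml.etree.ElementTree as ET

perf = ET.Element("ProductionPerformance")
ET.SubElement(perf, "ID").text = "BATCH-0421"
resp = ET.SubElement(perf, "ProductionResponse")
seg = ET.SubElement(resp, "SegmentResponse")
ET.SubElement(seg, "ProcessSegmentID").text = "MIX-AND-REACT"
ET.SubElement(seg, "ActualStartTime").text = "2005-06-28T08:00:00"
ET.SubElement(seg, "ActualEndTime").text = "2005-06-28T09:42:00"
prod = ET.SubElement(seg, "ProductionData")
ET.SubElement(prod, "ID").text = "QuantityProduced"
ET.SubElement(prod, "Value").text = "950"
ET.SubElement(prod, "UnitOfMeasure").text = "cans"

print(ET.tostring(perf, encoding="unicode"))
```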

Quarantining a batch based on an anomaly in the process is the epitome of a closed-loop behavior. Production quality alerts appear in the dashboard, and the production supervisor can then drill down into the alert to perform a rapid root cause analysis. At this point, it is important to have not only the visibility to stop or change the process, but an understanding of why this problem has occurred so as to prevent it from reoccurring. The final stage would thus be the production supervisor initiating a corrective action to fix the problem, resolving the exception before it becomes a customer issue in an effort to have a continuous improvement environment.

Another often presented scenario leverages radio-frequency identification (RFID) technology. In this scenario, one might have paint cans containing a certain color or a certain blend that are moving more quickly than others. RFID-enabled business processes would indicate the pattern of these cans on the floor. In addition, the notification of material available for shipping would occur automatically and immediately. What one would like to be able to do is respond at the manufacturing level to this change on a "now what?" basis. For example, the sales department might want to rapidly capitalize on this opportunity. In this scenario, the production plan can be re-aligned in real time, based on the actual capability to deliver or the capability to promise (CTP), and the transient opportunity can be successfully realized since one has the ability to respond.

With the above scenario, we are talking once again about a closed-loop application, whereby Lighthammer receives the schedule and master data from SAP, and Lighthammer in turn uses SAP XI to deliver real time alerts and KPIs to the SAP dashboard. The dashboard itself is a composite application consisting of the XI views, the KPIs, and any accompanying alerts. There might be alerts coming out of the SAP environment and out of the external plant level systems as well. In which case, Lighthammer would be monitoring conditions, calculating KPIs, and further applying execution logic.
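
The sketch below shows, in miniature, the kind of monitor-calculate-alert logic such a layer might apply; the KPI (first-pass yield), threshold, and record format are purely illustrative assumptions on our part:

```python
# Minimal sketch of the monitoring layer described above: calculate a KPI
# from plant readings and raise an alert for the dashboard when it drifts
# outside limits. KPI definition, tags, and thresholds are invented.
from typing import Optional

def first_pass_yield(units_started: int, units_passed: int) -> float:
    return units_passed / units_started if units_started else 0.0

def check_batch(batch_id: str, started: int, passed: int,
                lower_limit: float = 0.95) -> Optional[dict]:
    """Return an alert record for the dashboard, or None if within limits."""
    fpy = first_pass_yield(started, passed)
    if fpy < lower_limit:
        return {"batch": batch_id, "kpi": "first_pass_yield",
                "value": round(fpy, 3), "action": "quarantine + drill-down"}
    return None

alert = check_batch("BATCH-0421", started=1000, passed=931)
if alert:
    print("ALERT:", alert)   # fires: FPY 0.931 < 0.95
```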

The possible value of this for customers could be multifold. First of all, it is a closed-loop system with real time synchronization—when a plant manager is looking at data from Lighthammer on his or her dashboard, it is live data. Moreover, users have control over how often the data is sent to the screen, which is done automatically in the background. The business implications of quality performance and delivery issues on the shop floor are thereby quantified and made visible, while proactive exception detection is supported to minimize the overall supply chain impact. In addition, production personnel are empowered with a productivity tool that enables them to access all the relevant documentation on one single dashboard or system, in order to manage by exception, leverage the dashboard as a decision support environment, perform tasks assisted by automated workflows, and initiate improvements and monitor their impact with the KPI dashboard.

In May 2005, almost immediately before the acquisition, Lighthammer unveiled CMS 11.0, which was a major upgrade of the flagship product, featuring enhanced scalability, multisite metrics, security, and traceability for regulatory compliance of the composite platform for building manufacturing intelligence applications. The new release also added features that extended the development environment's existing performance management, continuous improvement, and operational synchronization capabilities, which were SOA-based. Importantly, the new capabilities aimed at helping developers to more easily build and deploy applications that can be accessed across the distributed manufacturing enterprise. At least 60 percent of the code in version 11, which had been under development for about a year, had reportedly been rewritten. For end users, this might mean about 15 percent more functionality and a complete upward compatibility with existing applications.

Among the most significant enhancements to version 11.0 was the Security Manager service, which added unified user management and single sign-on capabilities for run-time applications. This means users will be able to access any CMS-built application regardless of the platform on which it runs. Therefore, CMS, which previously operated only on Microsoft Windows-based systems, can now run on other operating environments, such as Linux, Sun Solaris, and HP-UX. The service also allows integration with a wide range of third-party authentication systems, including SAP, lightweight directory access protocol (LDAP) using Active Directory, security assertion markup language (SAML), Windows Domains, Kerberos from Massachusetts Institute of Technology (MIT), and others. These features should allow customers to manage user roles, memberships, and attributes better, as well as to define authentication or authorization services either from existing enterprise user management directories or through the Lighthammer application. This service should thus provide the ability to implement a security strategy that could fit virtually any existing enterprise architecture and should extend "single sign-on" into the domain of plant applications, improving compliance.

Additional compliance and traceability features that were added include an electronic signature service and a multilevel confirmation or challenge capability, which securely controls and documents user actions for regulatory compliance with 21 Code of Federal Regulations Part 11 (21 CFR 11), Sarbanes-Oxley (SOX), and other regulations. The enterprise application integration (EAI) capabilities have also been enhanced, with the addition of new business logic capabilities that take advantage of Web services in SAP NetWeaver to simplify data integration between plant systems and enterprise systems. Last but not least, the generally available CMS version 11.0 laid the groundwork for another upgrade set. Currently, the product is built mostly on Java, but the logic engine is based on Microsoft .NET. The next release, however, will be 100 percent Java-based, which should give customers a much broader choice of development platform.

Lighthammer's process manufacturing industry expertise and foresight in developing intelligent manufacturing middleware was helped by its early commitment to open technologies like the ISA-95 standard, Java, extensible markup language (XML), and SOA. Even earlier releases featured Lighthammer's leadership in the deployment of these open technologies as enablers for acquisition, analysis, distribution, and presentation of information from manufacturing systems. Lighthammer CMS functionality has long included built-in transformation of data into any standard XML message structure, such as Microsoft BizTalk, RosettaNet, and others, as well as the ability to interface with peer plant-level or enterprise-level systems using XML as the default data format for both incoming and outgoing data. Back in 2001, Illuminator 8.5 introduced a breakthrough Intelligent Agent subsystem, which could be used to enable inter-application messaging upon detection of production events or exceptions; automated calculation of KPI metrics; automatic transfer of information between XML, database, and e-mail sources; the gathering and conversion of data from external Web sources; and much more.

