
Ed-itorial

Driving Brick and Mortar Sales with Big Data!

Edward McLeod

Amazon is a beast! They and other online retailers are digging deeply into your brick and mortar (B&M) sales. Most stores cannot survive if they lose 25% of their gross revenues. Per an April 15, 2017, New York Times article, American B&M retail sales are at a tipping point. 89K general merchandise jobs have been lost since October 2016 ... more than the entire US coal industry! With approximately 1 in 10 Americans working in retail, B&M’s future looks bleak.

Your B&M business is struggling … what can you do? How can you provide B&M shoppers with an exceptional, Amazon-like experience? It’s a daunting task. It can be solved, but it requires cross-manufacturer and retailer collaboration.

The B&M solution builds upon the GS1 standard I co-chaired in 2012 to drive packaging label standards. It leverages the label content communication standard to deliver key marketing information to consumers at the point of purchase (PoP). You may even be participating in GS1’s current US Mobile Scan efforts. Pushing content to shoppers is necessary, but not sufficient.

What if …

·       You knew when your shoppers were in the store?

·       You could deliver product benefits, reviews, and other content to their mobile devices?

·       You could alert them to nutritional facts or ingredients to which they had allergies?

·       You knew shopper preferences and could personalize marketing messages?

·       You could offer coupons or loyalty points when they were looking at your products?

·       You could suggest additional purchases in real-time?

·       You could verify your PoP materials were deployed in every store?

·       You could assess promotional effectiveness based on demographics?

·       And what if a single solution worked across every retailer?

This is the promise of Big Data and communicating with shoppers at the PoP! This sounds great, but how can you deliver against this objective? The good news is the required technology exists today, but there are go-to-market issues.

Communicating with shoppers at the PoP works like this:

1)      The shopper scans product GTINs with our mobile app. The app includes their personal preferences to return the information that is most meaningful to them.

2)      The app looks up the product information.

3)      Based on the product scanned and the shopper’s preferences, personalized marketing messages are delivered to the shopper.

4)      The Magic Moment! The shopper receives data that is meaningful to them to assist their purchasing decisions.

5)      Creation of the Mother Lode! The shopper’s scans record who scanned which products, at what location, and when they did it!

6)      Manufacturers and retailers integrate the shopper actions with their POS data to quantify the sales lift versus history.
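
The shopper-scan record created in step 5 can be sketched as a small data structure. This is a minimal illustration only; the field names, sample GTIN values, and aggregation below are my assumptions, not part of any GS1 specification.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical scan-event record: the "Mother Lode" captures who
# scanned which product, at what location, and when (step 5 above).
@dataclass
class ScanEvent:
    shopper_id: str      # anonymized app-user identifier
    gtin: str            # the scanned product's GTIN
    store_id: str        # the retailer location
    scanned_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

# Aggregating events by GTIN gives a first cut at promotional reach.
def scans_per_product(events):
    counts = {}
    for e in events:
        counts[e.gtin] = counts.get(e.gtin, 0) + 1
    return counts

events = [
    ScanEvent("shopper-1", "00012345678905", "store-42"),
    ScanEvent("shopper-2", "00012345678905", "store-42"),
    ScanEvent("shopper-1", "00098765432109", "store-7"),
]
print(scans_per_product(events))  # {'00012345678905': 2, '00098765432109': 1}
```

Rolling these events up by shopper, store, or time window is what turns individual scans into the demographic and promotional analyses described in step 6.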

The road to success is cross-manufacturer collaboration! The entire approach hinges on delighting the shopper at the “Magic Moment.” They must receive personalized data providing them with a superior shopping experience! This can only be accomplished with critical mass and a clear identifier so shoppers know which products are, and are NOT, participating in the program. If few products participate in the program, the shopper will lose interest, and the solution fails. Similarly, if the shopper cannot differentiate participating from non-participating products, they will stop using the app. Nobody likes taking an extra process step (scanning the product) to yield no results!

So, what must be true to delight the shopper and build the Big Shopper Data?

·       At least 30% of a retailer’s product volume has product information available to shoppers.

·       Participating products must have a common identifier on the front label notifying shoppers that product information is available. Think of a recycle symbol, rather than a QR code. The symbol’s purpose is to notify shoppers they can scan the GTIN to secure product information.

·       The mobile app is the key! It must be easy to use and provide personalized value to shoppers.

 

Key issues, risks, mitigation plans, and value

•       Shoppers / Consumers

        •       Risks / Issues

                •       Lack of program knowledge – Mitigate with an advertising campaign

                •       Acquisition costs / how to get shoppers to engage (the “Magic Moment”) – Verify and deliver a superior shopper solution

                •       How to recognize participating products – Label icon

        •       Benefits

                •       Decision-making data when shopping

                •       Rewards programs

                •       Ingredient information

                •       Information during consumption

                •       A single app across retailers

                •       BIGGER print

•       Retailers

        •       Risks / Issues

                •       Some top retailers already have shopper apps. The shopper-centric app generates two issues:

                        1.       Retailers no longer control all of the shopper data

                        2.       It levels the cross-retailer playing field. Today’s leading retailers lose their competitive advantage with a centralized big shopper data and analytics approach.

                •       Mitigation approach: share the information back with retailers. They gain additional product insights based on specific shopper requests.

                •       In-store Wi-Fi – Not required, but an opportunity to delight shoppers

        •       Benefits

                •       Incremental sales via improved product data

                •       The ability to act quickly to save brick and mortar. If online sales take 25% of a store’s gross, it will likely fail.

•       The eCommerce competition

        •       Risks / Issues

                •       eCommerce growth is explosive – Compete by providing a superior B&M shopping experience

        •       Benefits

                •       Drive brand loyalty vs. the lowest price available on the web

•       Manufacturers / Brand Owners

        •       Risks / Issues

                •       Cost of solution development, deployment, and support – Offset by big shopper data value

                •       Top-retailer friction from the competing app – Share shopper data with retailers

                •       Levels the playing field for less sophisticated manufacturers – Manufacturers compete on their analytical capabilities

                •       Lack of structured label content – Leverage existing application solutions based on the GS1 labeling standard

        •       Benefits

                •       Personalized, in-store marketing

                •       Incremental sales

                •       Shopper loyalty via rewards programs

                •       Being perceived as more tech savvy and transparent than competitors

Contact me now at Ed.McLeod@cleveltraining.com to participate in a consortium to deliver this solution. Success requires critical mass and a willingness to place a common identifier on your product’s front label.  

© 2017, C-Level Training, LLC

Permalink

Product Development: Is it Opening Night, Every Night?

Edward McLeod

On opening night, something usually goes wrong. People learn from it and improve future performances. But what if things never got better ... the same mistakes were made time after time?

This is often the case with new product development. The people “in” the process work diligently, but antiquated IT solutions let them down. Historically, IT applications were built to optimize one activity, or a small group of activities, rather than the end-to-end business process. These siloed solutions require people to manually interpret the output of the previous step, rather than immediately moving forward with trusted data. These steps represent “seams” in your product development process, and are potential points of failure. Worse, the data is often dispersed across many applications or even hidden on researchers’ hard drives. This “dark data” makes it difficult to develop a new product or even leverage data from previous efforts. The result is repeated defects: “opening night, every night.”

So how bad is the problem? Let’s discuss formulating a new product. I call it "the ankle bone is connected to the shin bone."

We begin with identifying an unmet consumer need. We must define “what” we are trying to accomplish. For example, we’re trying to remove stubborn stains from clothing. Three potential ways to remove them include:

1.       Add a bleach component to the detergent

2.       Develop a spray-on solution prior to washing the clothes

3.       Develop a real-time stain removal agent

Each approach has pros and cons. We must clearly identify our target customer and “what” we’re trying to deliver. All approaches can be part of the normal clothes-washing process. However, approach three also offers immediate and mobile stain removal.

Scientists begin experimenting with different ingredients (aka materials) to identify potential solutions. As new materials are verified, they are added to the material master, along with the purpose for which they’re approved. Different materials are combined in different ways to create new batches. The batches are evaluated against success criteria to determine how well they deliver against the intent. The scientists iterate through the formulation and testing process, recording test results from each combination.
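
The formulate-test-record loop above can be sketched in a few lines. Everything here is hypothetical (the material names, the single stain-removal score, and the 0.8 threshold); real lab systems capture far richer data.

```python
# Illustrative material master: material name -> approved purpose.
approved_materials = {
    "surfactant-A": "cleaning agent",
    "enzyme-B": "stain removal",
    "perfume-C": "fragrance",
}

def evaluate_batch(composition, test_score, threshold=0.8):
    """Record a batch of combined materials and whether it met the
    success criterion (here, a single hypothetical stain-removal score)."""
    for material in composition:
        # Only verified materials from the material master may be used.
        if material not in approved_materials:
            raise ValueError(f"{material} is not in the material master")
    return {"composition": composition,
            "score": test_score,
            "passed": test_score >= threshold}

# One iteration of the loop: combine materials, test, record the result.
batch = evaluate_batch({"surfactant-A": 0.7, "enzyme-B": 0.3}, test_score=0.85)
print(batch["passed"])  # True
```

Each iteration appends another record like this, so scientists can compare combinations against the success criteria rather than rediscovering past results.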

During the development process, they must also ensure the product is fit for use from a manufacturing, customer (retailer), and consumer perspective. In the manufacturing realm, you may have a great new product, but if your plants can’t produce it without inventing a new production component, the cost and time to deliver your new product increase. From a customer perspective, the product and packaging must fit on their shelves, and be available when they set their planograms. The consumer may have other guardrails against which you must design. For example, how easy is it to open the product? What happens if it is dropped? All of these factors must be considered when designing your new product and packaging.

The primary and secondary packaging must also be designed and approved during the product development cycle, including creating the label graphics and copy content. If this product is offered for sale globally, you’ll also need to manage the copy in several languages. 

As the correct way(s) to manufacture the new product are uncovered, procedures and quality specifications are generated to support the make, pack, and ship processes.

These represent a subset of the things you must do to bring a new product to market. Can you imagine the productivity you’d gain by improving the seams? Employee productivity, product quality, speed of delivery, and repeatability are vastly improved through integrated data.

I’ve navigated these waters and can help you deliver business results faster. Transformed processes, data definition and ownership, vendor selection, organizational change, and several other factors will impact your effectiveness. Contact me to learn more.

Permalink

PLM Environment Choices

Edward McLeod

Your PLM software provider may have several different environment options to deliver your solutions. They can range from you owning everything in-house via capital expenditures, to a variety of cloud solutions. Renting hardware only is called Infrastructure as a Service (IaaS). Renting hardware and some software is referred to as Platform as a Service (PaaS), and finally a vendor can supply all hardware and software via Software as a Service (SaaS). As you move through the continuum from in-house to SaaS, accountability and control shift to your vendor(s). Each approach has advantages and risks. You must decide how much risk you can tolerate when outsourcing some or all of the components. Part of your decision is determining what is best for your company at this point in time and for the foreseeable future. Circumstances change over time. Make sure you document “why” you selected a given approach.

Traditionally, PLM has used an in-house approach. Your IT group configures and/or customizes the vendor’s software on their choice of the vendor’s approved hardware, operating system, and database software. This approach maximizes your business process flexibility, and focuses operational accountability on your internal resources. Customization should only be considered if your internal process provides you with a significant competitive advantage. Think long and hard before committing to a custom solution.

You can choose to rent hardware, software, and support capabilities. However, when you rent a service, control and accountability transfer to the vendor. Keep in mind that many suppliers in addition to your PLM partner offer environments (e.g. PaaS) on which you can host your solution. Choosing this route requires your Procurement organization to vet them too.

Infrastructure as a Service (IaaS) is an offering to rent the physical hardware on which you install PLM and other software to build your solution. Many companies leverage this approach today, sometimes as a private cloud in their own data centers. They avoid capital expenditures by having a third party supply the physical hardware. This is a relatively low risk approach, especially if the stand-alone hardware resides in your data center. However, hardware shared with other customers in your supplier’s data center is riskier (see data security concerns below). The software remains a capital purchase for your company. Another risk is your inability to improve performance by swapping out components and/or tuning them differently. Performance may only be improved by procuring more of the same IaaS environment from your supplier. Further, your supplier will make environment upgrades / changes periodically, which could “break” some of your existing functionality, especially custom code.

Platform as a Service (PaaS) extends the IaaS concept by adding the software on which you can develop capabilities. In the case of enterprise software, your partner supplies you with the hardware and commercial off-the-shelf (COTS) software to configure / customize a solution meeting your business process requirements. Your organization builds and operates the configured / customized solution. The vendor owns all hardware and software upgrades and timing, which again can “break” any customizations you may have developed.

Software as a Service (SaaS) supplies you with COTS capabilities with no option to customize. No customization is a huge benefit, as future hardware and software upgrades are vendor-tested and should continue working over time. Further, the core process and associated analytical solutions will likely advance faster because of multi-customer learnings across the shared platform. Other reasons to leverage this approach include zero capital expenditures and a single supplier. If something goes wrong, everyone knows who is accountable to fix the problem. The SaaS approach enables you to purchase what you need, rather than securing and training your own IT resources. The solution must still be configured to include your master data, phase-gate criteria, etc., so it requires up-front development and testing.

Data security is the key concern most companies have when transferring accountability to a vendor. We frequently hear of data breaches and the corresponding financial loss. Your security risk lies not only with your supplier, but also with their other customers. Your PLM solution manages secret information, including formulas and making instructions. You must evaluate whether you can trust a partner to secure your data in a cloud environment shared with their other customers. In theory, cloud suppliers are the experts and have the best security resources. This is not always true. Be very careful about managing your data outside your firewall, as once it’s breached, it cannot be recaptured.

For each environment option a potential partner claims to support, ensure it applies to all of the applications you plan to use. If some of their application software requires a different operating system or hardware, you’ll have cross-environment integrations to design and build. They must deliver the full suite of environment requirements before you can quantify integration costs. Based on your analysis, a PLM partner may only be able to deliver an end-to-end (ETE) integrated solution by using multiple physical environments.

If using anything other than an in-house approach, contract negotiations must include Service Level Agreements (SLAs) and maximum cloud cost increments over the coming years. Negotiating next year’s costs after transitioning all of your users leaves you with little leverage. Your cloud service metrics should be based upon the experience you choose to supply your users, rather than the number of CPU cycles consumed. If your contract is based on CPU cycles, your vendor has little incentive to upgrade hardware, as their profit increases by selling you more cycles on the slower processors.

Beyond the environments offered, you must also drive clarity on your potential partner’s software support. This is critical to understand for in-house and IaaS approaches. Software isn’t perfect. Things break, and when they do, your partner, rather than you, may need to supply the fix. Quantifying the number of back-versions your partner supports drives insights into how frequently you must upgrade your base software. Almost all vendors support their software at least two major releases back; most longer. This is important, as most providers upgrade their software at least annually. Every time you upgrade hardware or software, you must test to ensure all of your functionality still works. Testing costs money. Your software partner will install fixes and/or new capabilities in only a limited number of back-versions. Based on the number of versions back your vendor patches, and the frequency of their releases, you can calculate the maximum duration you can keep a given version of their software and still receive fixes. This factors into your ongoing operational costs.
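
The back-version arithmetic at the end of this paragraph is simple to work through. As a hypothetical example, assume a vendor that ships one major release per year and patches two versions back:

```python
def max_supported_years(releases_per_year, versions_patched_back):
    """How long you can stay on one version and still receive fixes:
    the patch window covers the current release plus N back-versions."""
    return (versions_patched_back + 1) / releases_per_year

# Vendor ships one major release per year and patches two versions back:
# you can hold a given version for up to three years before upgrading.
print(max_supported_years(releases_per_year=1, versions_patched_back=2))  # 3.0
```

In that scenario you could stay on one version for up to three years; a vendor releasing twice a year with the same patch window would cut that to eighteen months, roughly doubling your testing and upgrade cadence.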

Permalink

Prepare for the Future of Packaging and Consumer Communication

Edward McLeod

Packaging label creation and the way consumers interact with products are undergoing a major transformation. GS1 standards are in place to structure and communicate label content (www.gs1.org/request-packaging-artwork-standard). And with the GS1 cloud efforts, communicating product information to consumers at the point of purchase (POP) is around the corner. So what does this mean for your brand?

I believe the next three phases of packaging labels and electronic interaction with consumers include:

·       Phase I - Initial consumer interaction: The first is the deployment of the GS1 cloud. It enables consumers to scan the GTIN using a mobile app to learn more about a product through a limited attribute set.

·       Phase II – Personalization: Creation of an app with consumer profiles to facilitate their shopping choices. For example, a user may select ingredients to which they are allergic. If the product contains these ingredients, the app highlights them as inconsistent with the user’s preferences (e.g. coloring the offending ingredients red). Think about pharma and how this could help consumers identify potential drug interactions. The app could also be used for additional POP content demonstrating why product A is a better value than its competitors. Facilitating this step requires leveraging the GS1 labeling standards above in combination with the GS1 cloud.

·       Phase III – Legislation: Completion of Phase II means all label information is available to consumers online as structured content. They no longer need to read the label. Rather, they can scan the item and verify it meets their criteria based on their user profile. This provides manufacturers with an incredible opportunity: Remove detailed content from the label! So why is this important? Because we make mistakes when generating thousands of different labels each year. If the mistake is significant, the product must be recalled from the shelves, which is very costly in terms of execution, and more importantly, brand image. Future label corrections are delivered electronically, rather than physically. No more label-based recalls!
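
The Phase II allergen check described above can be sketched in a few lines. This is a hypothetical illustration; a real app would pull structured ingredient content via the GS1 cloud rather than hard-coded lists, and the function and field names here are mine.

```python
# Hypothetical Phase II check: find ingredients that conflict with the
# shopper's profile so the app can highlight them (e.g., in red).
def flag_conflicts(product_ingredients, shopper_allergens):
    """Return the ingredients the app should highlight for this shopper."""
    allergens = {a.lower() for a in shopper_allergens}
    return [i for i in product_ingredients if i.lower() in allergens]

# Ingredient list scanned from a product, matched against a user profile.
ingredients = ["Water", "Wheat Flour", "Soy Lecithin", "Sugar"]
profile = ["soy lecithin", "peanuts"]
print(flag_conflicts(ingredients, profile))  # ['Soy Lecithin']
```

The same lookup shape extends naturally to the pharma case: swap the allergen set for a list of the shopper’s current medications and check for known interactions.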

Achieving this goal requires action. Your company must:

·       Become a GS1 member (if you aren’t already).

·       Engage in the GS1 cloud efforts.

·       Structure your label content per the GS1 standards.

·       Work with GS1 to expand mobile content to include the full label attribute set following the successful GS1 cloud pilot completion.

·       And finally, form a consortium to lobby Congress to change packaging laws based on the successful electronic delivery of full label content.

I feel a sense of urgency based on the current business-friendly political environment. Leveraging this approach has start-up costs, but provides annual benefits in terms of label creation and modification, as well as consumer satisfaction. Please contact me if you’d like to discuss this in more detail.

Permalink

Dilapidated Data

Edward McLeod

Sales and market share have trended downward every month for the last quarter. Your company’s profitability and cash flow are at risk. As the leader, you need to act. IT solutions are in place to provide insights, but you don’t trust the data. What are you going to do?

In today’s fast-changing environment, you must quickly decide how to adjust to take advantage of changing market conditions. Analysis paralysis leads to failure. Further, poor data quality leads to poor decisions. To make faster and better business decisions, companies require quick access to quality data.

You might think, “What an amazing grasp of the obvious!” However, the reality is we have disparate systems across, and even within, business processes, yielding inconsistent or “dilapidated” data. This generates confusion not only about how to resolve the business issue, but even about clarifying the root cause. How do we build the data quality enabling quick decision-making?

The road to the “promised land” starts with treating data as a corporate asset. Identify data owners who understand how the data is used, and provide them with a horizontal governance process and the authority to exercise it. Owners must manage their data, ensuring the standards, security, system of record, and consumers are documented and governed from cradle to grave. The data owners are your first line of defense to deliver data quality. They supply input to the portfolio process quantifying the impact of requested changes. This requires a horizontal approach spanning business processes to ensure end-to-end data integrity.

There is no silver bullet. This is heavy lifting. Leaders must treat data as a corporate asset to correctly respond to ever-changing business conditions and emerging opportunities. Data owners, dictionaries, security management, maintenance and distribution processes contribute to healthy data and faster business decisions. Act now to improve your data quality and decision-making speed.

Permalink

Phase-gates, deliverables, and decision accountability

Edward McLeod

Successful enterprise software delivery requires a phase-gate process. The Project Management Institute (PMI.org) and many other phase-gate examples are available on the web. For this discussion, I’ve selected a generic Wikipedia version. We’ll discuss requirements and release management, along with different development approaches (e.g. agile) in a future blog.

The phase-gate concept identifies a series of efforts or “phases.” Each is separated by a “gate” ensuring the criteria for the previous phase were successfully delivered. Each phase has specific deliverables, with clear delivery and decision-making accountability. The HR component of “who will do what,” and “who will make which decisions” is critical to drive productivity. Failure to clearly identify “who will do what” leads to missed expectations and delayed business results. Failure to align “who will make which decisions” yields dysfunctional behavior. Business skills are required to identify “what” the target is, while IT skills determine “how” to technically deliver it. As you progress through the phase-gate process, decision accountability transitions from “what” to “how” resources.

Gating meetings are critical to your enterprise’s success and should not be taken lightly. They represent go/no-go decisions to continue allocating your company’s limited resources to this effort. The gating process leverages checks and balances throughout, halting projects not meeting their business objectives. As a leader, you must celebrate not only successful project delivery, but also killing projects not meeting their success criteria. The team has not necessarily failed because the project couldn’t deliver results. Rather, the idea either wasn’t good enough, or the current business environment or technical capabilities are unable to deliver the objectives. Continuing to pour your people and money into a losing project is painful and ineffective for your company.


Permalink

Key processes enabling enterprise software success ... What, How, and Who

Edward McLeod

Today’s C-level executives must deliver business results faster, with ongoing quality, using fewer resources. Driving productivity via integrated Information Technology is the path to the future, and strategic IT deliverables will continue playing a larger role in an enterprise’s success or failure. Companies viewing IT as merely an expense cannot compete in the fast-paced global business environment. However, IT alone will not drive success. Rather, transformed business processes, brought to life through integrated data, cost-effective applications, and scalable technology, are the solution. These are the promises of enterprise software.

Successful leadership, process, and politics require an aligned plan. Thinking with the end in mind, organizations must define “what” to clarify the desired outcome, “how” they will approach the problem, and “who” is accountable for which components to successfully deliver business results faster. The key to success is clearly understanding the problem before trying to solve it.

You’ll need a technique to prioritize your improvement opportunities relative to all of the other company requests. This is the portfolio process. The direction comes from the top, based on needs and desires across the company. The portfolio process identifies, prioritizes, and allocates resources to the top choices. Importantly, the process also identifies efforts you will NOT work on. Identifying and clearly communicating the requests for which no resources are planned is critical. If some projects aren’t stopped, or at least delayed, your resources are spread too thinly to deliver against your most important efforts.

Peeling back the onion’s layers, the portfolio process leads into a phase-gate approach. A phase-gate process defines the steps and checkpoints in the development of new capabilities. It begins with an idea, then progresses to analysis, a detailed business case, development, testing, launch, and finally a review of results. Following each phase along the way is a gating meeting resulting in a go/no-go decision. Meeting results can be: yes – move forward; no – stop all efforts; or go back and do more work prior to leadership making the decision. It is imperative to review each phase’s results and provide the team with formal alignment to proceed. Continuing through the steps without checkpoints will almost certainly yield an out-of-control effort, with an unclear target, and ultimately unrealized business benefits.
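
The three gate outcomes described above can be sketched as a tiny decision helper. The rule used here (incomplete work recycles; complete work either passes or kills the project) is my simplification, not a formal PMI definition.

```python
from enum import Enum

# The three possible results of a gating meeting, as described above.
class GateDecision(Enum):
    GO = "move forward"
    NO_GO = "stop all efforts"
    RECYCLE = "do more work before deciding"

def gate(meets_criteria, work_complete):
    """Hypothetical go/no-go rule for a single gate: incomplete work
    recycles; complete work either passes or kills the project."""
    if not work_complete:
        return GateDecision.RECYCLE
    return GateDecision.GO if meets_criteria else GateDecision.NO_GO

print(gate(meets_criteria=True, work_complete=True).value)   # move forward
print(gate(meets_criteria=False, work_complete=True).value)  # stop all efforts
```

Making the decision explicit like this is the point of the gate: leadership formally records one of the three outcomes instead of letting the project drift forward by default.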

While the portfolio and phase-gate processes are necessary, they are not sufficient to drive success. Another onion layer must be peeled back, clearly identifying who is accountable for which deliverables in each phase. Accountabilities will shift across phases. In the early phases (e.g. ideation and analysis), business resources define “what’s desired.” Accountability then transitions to IT resources to define “what’s possible,” based on current constraints. The key point is that resources will have different responsibilities throughout the process. Everyone must understand the deliverables for which they are accountable, and the components for which they’ll hold their peers accountable. Failure to define, up front, who is accountable for which deliverables yields dysfunctional behavior.

In my next blog, I’ll dig deeper into the phase-gate and HR processes to drive clarity on accountability transitions as you progress through the phase-gate process.

Permalink

Welcome to C-Level Training

Edward McLeod

Since retiring from P&G last summer, I’ve captured my learnings from 35 years of application development in my new book, Successful Enterprise Software. It's an executive’s guide detailing the leadership, process, and politics required to drive business results faster!

Statistics from Gartner, McKinsey, and others put the rate of troubled or failed mega-projects (those costing $1M+) at 65%. Adding new technologies and remote sites pushes the failure rate above 80%! Why are we doing so poorly? The key factors affecting mega-project success are leadership, process, and politics. Modifying the way an organization works requires top-down leadership, focused on transforming the way work is executed. Successful delivery is enabled by aligned accountability and work process. Collaboration, based on understanding politics, is the third component needed to deliver against the “One Team, One Dream” mantra.

Transforming your critical business processes through enterprise software is a multi-million-dollar investment, which will hopefully be in place for decades. Getting it “right” is crucial to your company’s success. To be clear, your enterprise solution will never be perfect. And you don’t want it perfect. Striving for perfection will drive costs dramatically higher and force significantly longer timelines, while providing only marginal improvement. As a C-level executive, you must collaborate across functions, leading the organization to deliver the correct balance of capability and quality, relative to cost. You must also require and support the development of ongoing processes ensuring operational excellence. The process your team uses, the organizational structure, and senior management support greatly affect the success of enterprise software solutions.

Learn more by requesting a free introductory call, downloading the free executive summary, or purchasing Successful Enterprise Software. At $1 per experience-year, it’s a bargain!

Permalink