Successful Smart Grid Architecture

The smart grid is progressing well on several fronts. Groups such as the GridWise Alliance, events such as GridWeek, and policy measures such as the American Recovery and Reinvestment Act in the U.S. have all brought positive attention to this opportunity. The boom in distributed renewable energy, with its demand for a bidirectional grid, is driving the need forward, as is the push for improved consumer control and awareness, giving customers the ability to engage in real-time energy conservation.

On the technology front, advances in wireless and other data communications make wide-area sensor networks more feasible. Distributed computation is certainly more powerful – just consider your iPod! Even architectural issues such as interoperability are now being addressed in their own forums, such as Grid-Interop. It seems the recipe for a smart grid is coming together in a way that would make many who envisioned it proud. But to avoid making a gooey mess in the oven, an overall architecture that carefully considers seven key ingredients for success must first exist.

Sources of Data

Utilities have eons of operational data: both real-time and archival, both static (such as nodal diagrams within distribution management systems) and dynamic (such as switching orders). There is a wealth of information generated by field crews and from root-cause analyses of past system failures. Advanced metering infrastructure (AMI) implementations become a fine-grained distribution sensor network feeding communication aggregation systems such as Silver Spring Networks' UtilityIQ or Trilliant's SecureMesh network.

These data sources need to be architected to be available to enhance, support and provide context for real-time data coming in from new intelligent electronic devices (IEDs) and other smart grid devices. In an era of renewable energy sources, grid connection controllers become yet another data source. With renewables, micro-scale weather forecasting such as IBM Research’s Deep Thunder can provide valuable context for grid operation.

Data Models

Once data is obtained, preserving its value in a standard format suggests thinking in terms of an extensible markup language (XML)-oriented database. Modern implementations of these databases have improved performance characteristics, and the International Electrotechnical Commission (IEC) common information model/generic interface definition (CIM/GID), though oriented more to assets than operations, is a front-running candidate for consideration.

Newer entries, such as the device language message specification/companion specification for energy metering (DLMS/COSEM) for AMI, are also coming into practice. Sometimes more important than the technical implementation of the data, however, is the model that is employed. A well-designed data model not only makes exchange of data and legacy program adjustments easier, but also helps in applying security and performance requirements. The existence of data models is often a good indicator of an intact governance process, for they facilitate use of the data by multiple applications.
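As a minimal illustration of the model-driven idea, the Python sketch below parses a small XML asset description into a neutral structure that multiple applications could share. The element names are invented for this example; the real CIM/GID uses RDF/XML and standardized class names that are omitted here.

```python
import xml.etree.ElementTree as ET

# Hypothetical CIM-style fragment; not the actual CIM/GID schema.
ASSET_XML = """
<Substation name="Walden">
  <Transformer id="TX-104" ratedMVA="25.0"/>
  <Transformer id="TX-105" ratedMVA="40.0"/>
</Substation>
"""

def load_assets(xml_text):
    """Parse the fragment into plain dictionaries usable by any application."""
    root = ET.fromstring(xml_text)
    return {
        "substation": root.get("name"),
        "transformers": [
            {"id": t.get("id"), "ratedMVA": float(t.get("ratedMVA"))}
            for t in root.findall("Transformer")
        ],
    }

assets = load_assets(ASSET_XML)
```

The point is that once the model is agreed, the parsing step is trivial and every consumer sees the same vocabulary, which is what makes multi-application use and governance practical.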

Communications

Customer workshops and blueprinting sessions have shown that one of the most common issues needing to be addressed is the design of the wide-area communication system. Data communications architecture affects data rate performance, the cost of distributed intelligence and the identification of security susceptibilities.

There is no single communications technology suitable for all utilities, or even for all operational areas within any individual utility. Rural areas may be served by broadband over power line (BPL), while urban areas benefit from multi-protocol label switching (MPLS) and purpose-designed mesh networks, enhanced by their proximity to fiber.

In the future, there could be entirely new choices in communications. The smart grid architect therefore needs to focus on security, on standardized interfaces that can accept new technology, on remote configuration to minimize any touching of smart grid devices once installed, and on future-proofing the protocols.

The architecture should also be traceable to the business case. This needs to include probable use cases that may not be in the PUC filing, such as AMI now, but smart grid later. Few utilities will be pleased with the idea of a communication network rebuild within five years of deploying an AMI-only network.

Communications architecture must also consider power outages, so battery backup, solar recharging, or other equipment may be required. Even arcane details such as “Will the antenna on a wireless device be the first thing to blow off in a hurricane?” need to be considered.

Security

Certainly, the smart grid's purpose is to enhance network reliability, not lower its security. But with the advent of the North American Electric Reliability Corporation's Critical Infrastructure Protection standards (NERC CIP), security has risen to become a prime consideration, usually addressed in phase one of the smart grid architecture.

Unlike the data center, field-deployed security has many new situations and challenges. There is security at the substation – for example, who can access what networks, and when, within the control center. At the other end, security of the meter data in a proprietary AMI system needs to be addressed so that only authorized applications and personnel can access the data.

Service oriented architecture (SOA) appliances are network devices to enable integration and help provide security at the Web services message level. These typically include an integration device, which streamlines SOA infrastructures; an XML accelerator, which offloads XML processing; and an XML security gateway, which helps provide message-level, Web-services security. A security gateway helps to ensure that only authorized applications are allowed to access the data, whether an IP meter or an IED. SOA appliance security features complement the SOA security management capabilities of software.

Proper architectures could address dynamic, trusted virtual security domains, combined not only with intrusion protection systems but also with anomaly detection systems. If hackers can introduce viruses through data (such as malformed video images that exploit faults in media players), then similar concerns should be under discussion for smart grid data. Is tampering with 300 megawatts (MW) of demand response much different from cyber attacking a 300 MW generator?
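To make the anomaly-detection idea concrete, here is a deliberately simple sketch in Python: flag a demand-response command whose magnitude falls far outside historical norms before acting on it. The z-score test and the threshold are stand-ins; production anomaly detection systems use far richer models.

```python
from statistics import mean, stdev

def is_anomalous(history_mw, command_mw, z_threshold=3.0):
    """Flag a demand-response command far outside historical norms.

    history_mw: recent legitimate command magnitudes (MW).
    A z-score test is a simple stand-in for a real detector.
    """
    mu, sigma = mean(history_mw), stdev(history_mw)
    if sigma == 0:
        return command_mw != mu
    return abs(command_mw - mu) / sigma > z_threshold

# Recent commands have been in the low tens of MW...
history = [12, 15, 11, 14, 13, 12, 16, 14]
# ...so a sudden 300 MW request deserves scrutiny before execution.
suspicious = is_anomalous(history, 300)
```

Even a crude gate like this illustrates why anomaly detection belongs alongside intrusion protection: the 300 MW command may be perfectly well-formed at the protocol level and still be an attack.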

Analytics

A smart grid cynic might say, “Who is going to look at all of this new data?” That is where analytics supports the processing, interpretation and correlation of the flood of new grid observations. One part of the analytics would be performed by existing applications. This is where data models and integration play a key role. Another part of the analytics dimension is with new applications and the ability of engineers to use a workbench to create their customized analytics dashboard in a self-service model.

Many utilities have power system engineers in a back office using spreadsheets; part of the smart grid concept is that all data is available to the community to use modern tools to analyze and predict grid operation. Analytics may need a dedicated data bus, separate from an enterprise service bus (ESB) or enterprise SOA bus, to meet the timeliness and quality of service to support operational analytics.

A two-tier or three-tier (if one considers the substations) bus is an architectural approach that segregates data by speed while maintaining the interconnections that support a holistic view of the operation. Connections to standard industry tools such as ABB's NEPLAN® or Siemens Power Technologies International's PSS®E, or to general tools such as MATLAB, should be considered at design time rather than as an additional expense commitment after smart grid commissioning.
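The segregation-with-a-holistic-view idea can be sketched as follows (Python; the class and tier names are invented for this toy model, and a real deployment would use an actual messaging product): traffic is routed to an operational or enterprise tier by timeliness, yet a cross-tier subscriber still sees everything.

```python
from collections import defaultdict

class TwoTierBus:
    """Toy model of a two-tier data bus: operational traffic is kept
    off the slower enterprise bus, but a holistic subscriber can still
    read across both tiers. Illustrative only, not a product API."""

    def __init__(self):
        self.queues = defaultdict(list)

    def publish(self, topic, payload, operational=False):
        # Timeliness tag decides which tier carries the message.
        tier = "operational" if operational else "enterprise"
        self.queues[tier].append((topic, payload))

    def holistic_view(self):
        # Analytics that need the whole picture subscribe to both tiers.
        return self.queues["operational"] + self.queues["enterprise"]

bus = TwoTierBus()
bus.publish("scada/feeder7/voltage", 118.2, operational=True)
bus.publish("billing/meter/M1042/read", 4521)
```

The design point is that quality-of-service segregation and a unified analytical view are not in conflict; the bus topology just has to be planned for both from the start.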

Integration

Once data is sensed, securely communicated, modeled and analyzed, the results need to be applied for business optimization. This means new smart grid data gets integrated with existing applications, and metadata locked in legacy systems is made available to provide meaningful context.

This is typically accomplished by enabling systems as services per the classic SOA model. However, issues of common data formats, data integrity and name services must be considered. Data integrity includes verification and cross-correlation of information for validity, and designation of authoritative sources and specific personnel who own the data.

Name services addresses the common issue of an asset – whether transformer or truck – having multiple names in multiple systems. An example might be a substation that has a location name, such as Walden; a geographic information system (GIS) identifier such as latitude and longitude; a map name such as nearest cross streets; a capital asset number in the financial system; a logical name in the distribution system topology; an abbreviated logical name to fit in the distribution management system graphical user interface (DMS GUI); and an IP address for the main network router in the substation.

Different applications may know new data by association with one of those names, and that name may need translation to be used in a query against another application. While rewriting the applications to a common model may seem appealing, it may very well send a CIO into shock. The smart grid should help propagate intelligence throughout the utility, but this doesn't necessarily mean replacing everything; rather, it should "information-enable" everything.
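The name-service pattern above can be sketched as a simple alias registry (Python; the registry API is hypothetical, and the sample values reuse the Walden example): every alias maps to one canonical key, so a name known to one application can be translated for a query against another.

```python
class NameService:
    """Map every alias of an asset to one canonical key, enabling
    cross-system name translation without rewriting applications."""

    def __init__(self):
        self._alias_to_key = {}
        self._names = {}  # canonical key -> {system: name}

    def register(self, key, **system_names):
        self._names[key] = system_names
        for name in system_names.values():
            self._alias_to_key[name] = key

    def translate(self, known_name, target_system):
        """Translate any known alias into the name a target system uses."""
        key = self._alias_to_key[known_name]
        return self._names[key][target_system]

ns = NameService()
ns.register(
    "asset-0042",               # canonical key (illustrative)
    common="Walden",            # location name
    gis="42.74,-71.31",         # GIS identifier
    finance="CAP-88231",        # capital asset number
    dms_gui="WLDN",             # abbreviated DMS GUI name
)
```

A thin translation layer like this is one way to "information-enable" legacy systems without forcing them all onto a single naming model at once.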

Interoperability is essential at both the service level and the application level. Some vendors focus more on the service level, but consider, for example, making a cell phone call from the U.S. to France – your voice data may well be code division multiple access (CDMA) in the U.S., travel by microwave and fiber along its path, and emerge in France in a global system for mobile communications (GSM) environment, yet your speech, the "application-level data," is retained transparently (though technology does not yet address accents!).

Hardware

The world of computerized solutions is not a matter of software alone. For instance, AMI storage consolidation addresses the concern that the volume of data coming into the utility will increase exponentially. As more meter data can be read on demand, data analytics will be employed to properly understand it all, requiring a sound hardware architecture to manage, back up and feed the data into the analytics engines. In particular, storage is needed in the head-end systems and the meter data management systems (MDMS).

Head-end systems pull data from the meters to provide management functionality, while the MDMS collects data from head-end systems and validates it; the data can then be used by billing and other business applications. Data in both the head-end systems and the master copy of the MDMS is replicated into multiple copies for full backup and disaster recovery. For the MDMS, the master database that stores all the aggregated data is replicated for other business applications, such as a customer portal or data analytics, so that the master copy of the data is not tampered with.
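A minimal sketch of the validation step in that head-end-to-MDMS flow (Python; the function name, record layout and rules are invented for illustration, and production MDMS validation is far richer): reads arriving from the head-end are split into valid and rejected sets before billing can use them.

```python
def validate_reads(reads, max_kwh_per_interval=50.0):
    """Split interval reads into valid and rejected sets, the way an
    MDMS validates head-end data before releasing it to billing.
    The range check is a placeholder for real validation rules."""
    valid, rejected = [], []
    for r in reads:
        ok = r["kwh"] is not None and 0 <= r["kwh"] <= max_kwh_per_interval
        (valid if ok else rejected).append(r)
    return valid, rejected

head_end_batch = [
    {"meter": "M1", "kwh": 1.2},
    {"meter": "M2", "kwh": -3.0},   # negative read: reject
    {"meter": "M3", "kwh": None},   # missing read: reject
]
valid, rejected = validate_reads(head_end_batch)
```

Only the validated set would flow on to billing and the replicated analytics copies, which is why the MDMS, not the head-end, is usually treated as the authoritative source.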

Since the smart grid essentially performs in real time, and the electricity business is non-stop, one must think of hardware and software solutions as needing to be fail-safe with automated redundancy. The AMI data especially needs to be reliable. The key factors then become: operating system stability; hardware memory access speed and range; server and power supply reliability; file system redundancy, such as a journaled file system (JFS); and techniques such as FlashCopy that provide a point-in-time copy of a logical drive.

FlashCopy can be useful in speeding up database hot backups and restores. VolumeCopy can extend the replication functionality by providing the ability to copy the contents of one volume to another. Enhanced remote mirroring (Global Mirror, Global Copy and Metro Mirror) can mirror data from one storage system to another over extended distances.

Conclusion

Those are seven key ingredients for designing or evaluating a recipe for success with regard to implementing the smart grid at your utility. Addressing these dimensions will help achieve a solid foundation for a comprehensive smart grid computing system architecture.

Business Process Improvement

In the past, the utility industry could consider itself exempt from market drivers like those listed above. However, today's utilities are immersed in a sea of change. Customers demand reliable power in unlimited supply, generated in environmentally friendly ways without increased cost. All the while, regulators are telling consumers to "change the way they use energy or be ready to pay more," and the Department of Energy is calling for utilities to make significant reductions in usage by 2020 [1].

“The consumer’s concept of quality will no longer be measured by only the physical attributes of the product – it will extend to the process of how the product is made, including product safety, environmental compliance and social responsibility compliance.”

– Victor Fung, chairman of Li & Fung,
in the 2008 IBM CEO Study

If these issues are not enough, couple them with a loss of knowledge and skill due to an aging workforce, an ever-increasing amount of automation and technology being introduced into our infrastructure with few standards, and tightening bond markets and economic declines that require us to do more with less. Now more than ever, the industry needs to redefine its core competencies, identify key customers and their requirements, and define processes that meet or exceed their expectations. Business process improvement is essential to ensure future success for utilities.

There is no need to reinvent the wheel and develop a model for utilities to address business process improvement. One already exists that offers the most holistic approach to process improvement today. It is not new, but like any successful management method, it has been modified and refined to meet continuously changing business needs.

It is agnostic about the methods used for analysis and process improvement, such as Lean, Six Sigma and other tools, serving instead as a framework for achieving results in any industry. It is the Baldrige Criteria for Performance Excellence (see Figure 1).

The Criteria for Performance Excellence are designed to help organizations focus on strategy-driven performance while addressing the key decisions that drive both short-term and long-term organizational sustainability in a dynamic environment. Is it possible that this framework was designed for times such as these in the utility industry?

The criteria are essentially simple in design. They are broken into seven categories, as shown in Figure 2: leadership; strategic planning; customer focus; measurement, analysis and knowledge management; workforce focus; process management; and results.

In this model, measurement, analysis and knowledge management establish the foundation. There are two triads: on the left-hand side, leadership, strategic planning and customer focus make up the leadership triad; on the right-hand side, workforce focus, process management and results make up the results triad. The alignment and integration of these essential elements of business create a framework for continuous improvement. This model should appear familiar in concept to industry leaders; there is not a single utility in the industry that does not identify with these categories in some form.

The criteria are built to elicit a response through "how" and "what" questions about key processes and their deployment throughout the organization. At face value, these questions appear simple. However, as you respond to them, you will recognize their linkages and begin to identify opportunities for improvement that are essential to future success. Leaders beginning this effort should not be surprised by the depth of the questions, or by how few people within the organization can provide complete answers.

In assessing the model's ability to meet utility industry needs, let's discuss each category in greater detail, relate it to the utility industry and include key questions to consider as you begin to assess your own organization's performance.

Leadership: Who could argue that the current demand for leadership in utilities is more critical today than ever before in our history? Changes in energy markets are bringing with them increased levels of accountability, a greater focus on regulatory, legal and ethical requirements, a need for long-term viability and sustainability, and increased expectations of community support. Today’s leaders are expected to achieve ever increasing levels of operational performance while operating on less margin than ever before.

“The leadership category examines how senior leaders’ personal actions guide and sustain the organization. Also examined are the organization’s governance system and how it fulfills legal, ethical and societal responsibilities as well as how it selects and supports key communities [2].”

Strategic Planning: Does your utility have a strategic plan? Not a dust-laden document sitting on a bookshelf or a financial budget, but a plan that identifies strategic objectives and action plans to address short- and long-term goals. Our current business environment demands that we identify our core competencies (and, more importantly, what are not our core competencies), identify strategic challenges to organizational success, recognize strategic advantages and develop plans that focus our efforts on objectives that will achieve our mission and vision.

What elements of our business should we outsource? Do our objectives use our competitive advantages and core competencies to diminish organizational challenges? We all know the challenges that are both here today and awaiting us just beyond the horizon. Many of them are common to all utilities: an aging workforce, decreased access to capital, technological change and regulatory change. How are we addressing them today, and is our approach systematic and proactive, or are we simply reacting to the challenges as they arise?

“The strategic planning category examines how your organization develops strategic objectives and action plans. Also examined are how your chosen strategic objectives and action plans are deployed and changed if circumstances require, and how progress is measured [2].”

Customer Focus: The success of the utility industry has been due in part to a long-term positive relationship with its customers. Most utilities have made a conscientious effort to identify and address the needs of the customer; however, a new breed of customer is emerging with greater expectations, a higher degree of sensitivity to environmental issues, a diminished sense of loyalty to business organizations and an overall suspicion of ethical and legal compliance.

Their preferred means of communication are quite different from those of the generations of loyal customers you have enjoyed in the past. They judge your performance against similar customer experiences received from organizations far beyond your traditional competitors.

You now compete against Wal-Mart’s supply chain process, Amazon.com’s payment processes and their favorite hotel chain’s loyalty rewards process. You are being weighed in the balances and in many cases found to be lacking. Worse yet, you may not have even recognized them as an emerging customer segment.

“The Customer Focus category examines how your organization engages its customers for long-term marketplace success and builds a customer-focused culture. Also examined is how your organization listens to the voice of its customers and uses this information to improve and identify opportunities for innovation [2].”

Measurement, Analysis, and Knowledge Management: The data created and maintained by GIS, CIS, AMI, SCADA and other systems constitute a wealth of information that can be analyzed to obtain knowledge sufficient to make rapid business decisions. However, many of these systems are difficult, if not impossible, to integrate with one another, leaving leaders with a lot of data but no meaningful measures of key performance. Even worse, a lack of standards related to system performance leaves utilities that do develop performance measures with only a limited number of inconsistently measured comparatives from their peers.

If utilities are going to overcome the challenges of the future, it is essential that they integrate all data systems for improved accessibility and develop standards that would facilitate meaningful comparative measures. This is not to say that comparative measures do not exist; they do. However, increasing the number of utilities participating would increase our understanding of best practices and enable us to determine best-in-class performance.

“The measurement, analysis and knowledge management category examines how the organization selects, gathers, analyzes, manages and improves its data, information and knowledge assets and how it manages its information technology. The category also examines how your organization reviews and uses reviews to improve its performance [2].”

Workforce Focus: We have already addressed the aging workforce and its impact on the future of utilities. Companion challenges related to the utility workforce include the heavy benefits burdens that many utilities currently bear. Also, the industry faces a diminished interest in labor positions and the need to establish new training methods to engage a variety of generations within our workforce and ensure knowledge acquisition and retention.

The new workforce brings with it new requirements for satisfaction and engagement. The new employee has proven to be less loyal to the organization, and studies show they will have many more employers before they retire than their predecessors did. It is essential that we develop ways to identify these requirements and take action to retain these individuals, or we risk increased training costs and operational issues as they seek new employment opportunities.

“The workforce focus category examines how your organization engages, manages and develops the workforce to utilize its full potential in alignment with organizational mission, strategy and action plans. The category examines the ability to assess workforce capability and capacity needs and to build a workforce environment conducive to high performance [2].”

Process Management: It is not unusual for utilities to implement new software with dramatically increased capabilities and then ask the integrator to make it align with their current processes, or to continue using their current processes without regard for the system's new capabilities. Identifying and mapping key work processes can unlock incredible opportunities for streamlining your organization and facilitate increased utilization of technology.

What are your utility's key work processes, how do you determine them, and what is their relationship to creating customer value? These are difficult questions for leaders to answer; yet without a clear understanding of key work processes and their alignment to core competencies, strategic advantages and challenges, your organization may be misapplying efforts related to core competencies, either outsourcing something best maintained internally or performing work that is better delivered by outsource providers.

“The process management category examines how your organization designs its work systems and how it designs, manages and improves its key processes for implementing these work systems to deliver customer value and achieve organizational success and sustainability. Also examined is your readiness for emergencies [2].”

Results: Results are the fruit of your efforts, the gift that the Baldrige Criteria enable you to receive from your applied efforts. All of us want positive results. Many utilities cite positive performance in measures that are easy to acquire: financial performance, safety performance, customer satisfaction. But which of these measures are key to our success and sustainability as an organization? As you answer the questions and align measures with your organization's mission and vision, it will become abundantly clear which measures you need to maintain and where to develop competitive comparisons and benchmarks.

“The results category examines the organization’s performance and improvement in all key areas – product outcomes, customer-focused outcomes, financial and market outcomes, workforce-focused outcomes, process-effectiveness outcomes and leadership outcomes. Performance levels are examined relative to those of competitors and other organizations with similar product offerings [2].”

A Challenge

The adoption of the Baldrige criteria is often described as a journey. Few utilities have embraced this model, yet it appears to offer a comprehensive approach to the challenges we face today. Utilities have a rich history and play a positive role in our nation, and a period of rapid change is upon us. We need to shift from reacting to leading as we solve the problems facing our industry. By applying this model for effective process improvement, we can once again create a world where utilities lead the future.

References

  1. Quote from U.S. Treasury Secretary Tim Geithner, as communicated in the SmartGrid Newsletter.
  2. Malcolm Baldrige National Quality Award, "Path to Excellence and Some Path Building Tools," www.nist.gov/baldrige.

Enabling Successful Business Outcomes Through Value-Based Client Relationships

Utilities are facing a host of challenges ranging from environmental concerns, aging infrastructure and systems, to Smart Grid technology and related program decisions. The future utility will be required to find effective solutions to these challenges, while continuing to meet the increasing expectations of newly empowered consumers. Cost management in addressing these challenges is important, but delivery of value is what truly balances efficiency with customer satisfaction.

Our Commitment

Vertex clients trust us to deliver on our promises and commitments, and they partner with us to generate new ideas that will secure their competitive advantage, while also delivering stakeholder benefits. Our innovative same-side-of-the-table approach allows us to transform the efficiency and effectiveness of your business operations, enabling you to lower your risk profile and enhance your reputation in the eyes of customers, investors and regulatory bodies. Working as partners, we provide unique insights that will generate actionable ideas and help you achieve new levels of operational excellence.

With a long heritage in the utility industry, Vertex possesses an in-depth knowledge and understanding of the issues and challenges facing utility businesses today. We actively develop insights and innovative ideas that allow us to work with our utility clients to transform their businesses, and we can enhance your future performance in terms of greater efficiencies, higher customer satisfaction, increased revenue and improved profitability.

Desired business outcomes are best achieved with a strategic, structured approach that leverages continuous improvement throughout. Vertex takes a four-level approach, which starts with asking the right questions. Levels 1 and 2 identify business challenges and the corresponding outcomes your utility hopes to achieve. Need to improve customer satisfaction? If so, is moving from the 2nd to the 1st quartile the right target? Pinpointing the key business challenges that are limiting or impeding your success is critical. These may include a need to reduce bad debt, reduce costs, minimize billing errors, or improve CSR productivity. Whatever challenges you face, collaboration with our experts will ensure your utility is on the right track to meet or exceed your targets.

Once the challenges and outcomes have been identified and validated, Vertex partners with clients to develop effective solutions. The solutions implemented in Level 3 consist of unique value propositions that, when combined effectively, achieve the desired business outcome for the business challenge being addressed. Vertex’s proprietary “Value Creation Model” enables us to develop and implement solutions that provide measurable business results and ongoing quality assurance.

Inherent to the success of this model is the Vertex Transition Methodology, which has resulted in 200 successful transitions over a twelve-year period. Due diligence yields a clear understanding of how the business operates. Mobilizing activities lay the foundation for the transition, and a baseline transition plan is established. The plans developed during the planning stage are then implemented, followed by a stabilization period that runs from the business transfer until operations are fully stable.

Another key element of this model lies in Vertex's transformation capabilities and what we refer to as our "6D" transformation methodology: Dream, Define, Design, Develop, Deliver, Drive. Our Lean Six Sigma methods guarantee successful deployment of continuous process improvement. In addition to Lean Six Sigma, the Vertex Transformation Methodology includes change management, people and performance management, and project management.

In Level 4 of the Vertex solution approach, Vertex measures the effectiveness of a solution by determining if it achieved the desired business outcome. We utilize a Balanced Scorecard approach to ensure that the business outcome positively impacts all of the key elements of a client’s business: Customer, Employee, Operational, and Financial. As desired business outcomes evolve, Vertex will remain committed to adapting our solutions in partnership with our clients to meet these changing needs.

Transforming Your Organization

If you’re ready to transform to an outcomes-based business, Vertex has the capability to help. Our service lines include: Consulting and Transformation, IT Applications Services and Products, Debt Management, and Meter-to-Cash Outsourcing.

Our transformation approach blends innovation and business process improvement, focusing on achieving your strategic objectives via our proven expertise and insights. We bring business transformation that secures greater efficiencies, improved effectiveness and enhanced services for your organization. All the while we never forget that our employees represent your brand.

We’ll work collaboratively with you, rapidly implementing services and delivering on continuous improvement to meet your goals. We’ll build on your business needs, sharing ideas and jointly developing options for change – working together to deliver real value.

Empower Your Customers To Reduce Energy Demand

The Energy Information Administration (EIA) forecasts a continuing gap between total domestic energy production and consumption through 2030. This delta will not be closed by supply alone; customer behavior changes are needed to reduce total consumption and peak load. Electric and gas utilities face tremendous challenges meeting energy supply and demand needs and will play a pivotal role in determining practical solutions. With the right approach, utilities will deliver on the promise of energy efficiency and demand response.

Energy market projections are highly speculative, as the market is characterized by high price volatility and rapid transformation. Adding to the uncertainty is the voluntary nature of demand response and energy efficiency programs, and the critical importance of customer behavior change. Utilities are spending billions of dollars, making program penetration essential – and customer education paramount. At an end-point cost of up to $300, five percent penetration is not the answer. Vertex can help mitigate these risks through highly effective management of customer care, CIS integration, pilot programs, and analytics. Vertex’s core “meter-to-cash” capabilities have undergone a major revolution in response to the new world of AMI, energy efficiency, and demand response. A robust set of new services will allow utilities to transform how they do business.

Smart meters put new demands on CIS platforms and traditional business processes – innovative rates, distributed generation, demand response and new customer programs all require creative change. Vertex is currently helping utilities develop and manage customer programs to fully exploit smart meter deployments and provide customer care to customers migrating to time-based rates. We deliver customer management services designed to drive penetration and meet the unique customer care needs generated by smart meter installations; energy efficiency and demand response programs that empower customers to manage their energy use and reduce consumption; and cost-effective customer care and billing solutions to support smart meters.

Water utilities are not immune to the need for conservation. In the past 30 years, the U.S. population has grown by over 50% while total water use has tripled. On average, Americans use approximately 75 to 80 gallons of water per person per day. Vertex can help water utilities address the unique conservation challenges they face, including customer care and program support, MDMS solutions to organize data for forecasting, code enforcement, business and customer insight, and other services.

Case Study – Hydro One

Hydro One is an Ontario, Canada-based utility and one of the five largest transmission utilities in North America. As the steward of critical provincial assets, Hydro One works with its industry partners to ensure that electricity can be delivered safely, reliably and affordably to its customers. Vertex has been providing Meter-to-Cash outsourcing services to Hydro One since 2002.

Applying the Vertex 4-level solutions approach enabled desired business outcomes:

Level 1: Identify Business Challenges

In 2006 Hydro One approached Vertex and indicated that one of its corporate goals was to dramatically improve customer satisfaction as measured by the Hydro One customer satisfaction survey. At that point, Hydro One’s customer satisfaction scores on agent-handled calls had hovered in the 75-76% range for several years. Up to that time, the relationship with Vertex had focused on significant cost reductions with no erosion of the service offered to customers. Now, Hydro One was looking to Vertex to help lead the drive to improve the customer experience.

Level 2: Identify Desired Outcomes

In 2007 Vertex and Hydro One entered into collaborative discussions to evaluate and analyze the historical customer satisfaction scores, and to work jointly to develop a plan to radically modify the customer experience and improve customer satisfaction. Those discussions led down several paths, and the parties mutually agreed to target the following areas for change:

  • The Vertex/Hydro One Quality program
  • A cultural adjustment that would reflect the change in focus
  • Technology that could help support Hydro One’s goals
  • End-to-end process review

Level 3: Develop & Implement Solution

Vertex has worked closely with Hydro One to help them deliver on their goal of significant improvements to customer satisfaction. Changes were applied to process, call scripts, quality measures and performance scoring at all levels in the organization, including incentive compensation and recognition programs.

Level 4: Measure Solution Results

  • Customer satisfaction scores on agent-handled calls increased from 76% in 2006 to 86% in 2008
  • Quality monitoring program changes yielded a 10% increase in first-call resolution
  • Bi-weekly Process/Quality forums were introduced
  • Monthly reviews with the client reinforced success and progress toward targets

Improving Call Center Performance Through Process Enhancements

The great American philosopher Yogi Berra once said, “If you don’t know where you’re going, chances are you will end up somewhere else.” Yet many utilities possess only a limited understanding of their call center operations, which can prevent them from reaching the ultimate goal: improving performance and customer satisfaction, and reducing costs.

Utilities face three key barriers in seeking to improve their call center operations:

  • Call centers routinely collect data on “average” performance, such as average handle time, average speed of answer and average hold time, without delving into the details behind the averages. The risk is that instances of poor and exemplary performance alike are not revealed by such averages.
  • Call centers typically perform quality reviews on less than one-half percent of calls received. Poor performance by individual employees – and perhaps the overall call center – can thus be masked by insufficient data.
  • Call centers often fail to periodically review their processes. When they do, they frequently lack statistically valid data to perform the reviews. Without detailed knowledge of call center processes, utilities are unlikely to recognize and correct problems.

There are, however, proven methods for overcoming these problems. We advocate a three-step process designed to achieve more effective and efficient call center operations: collect sufficient data; analyze the data; and review and monitor progress on an ongoing basis.

STEP 1: COLLECT SUFFICIENT DATA

The ideal sample size is 1,000 randomly selected calls. A sample of this size typically provides results accurate to within ±3 percent, at a better than 90 percent confidence level. These are the levels of accuracy and confidence that businesses typically require before they will undertake action.
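The sampling math behind those numbers can be sketched with the standard margin-of-error formula for an estimated proportion. This is a generic statistical illustration, not a calculation from any particular call center's data:

```python
import math

def margin_of_error(n, p=0.5, z=1.645):
    """Margin of error for a proportion estimated from a random sample.

    n: sample size
    p: assumed proportion (0.5 is the worst case, giving the widest margin)
    z: z-score for the desired confidence level (1.645 is roughly 90%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-call sample at 90% confidence, worst-case proportion:
moe = margin_of_error(1000)
print(f"±{moe * 100:.1f}%")  # prints ±2.6% – comfortably inside ±3 percent
```

Quadrupling the sample only halves the margin, which is why 1,000 calls is a practical sweet spot between accuracy and review effort.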

The types of data that should be collected from each call include:

  • Call type, such as new service, emergency, bill payment or high bill, and subcall type.
  • Number of systems and/or screens used – for example, how many screens did it take to complete a new service request?
  • Actions taken during the call, such as greeting the customer, gathering customer-identity data, understanding the problem or delivering the solution.
  • Actions taken after the call – for example, entering data into computer systems, or sending notes or emails to the customer or contact center colleagues.

Having the right tool can greatly facilitate data collection. For example, the call center data collection tool pictured in Figure 1 captures this information quickly and easily, using three push-button timers that enable accurate data collection.

When a call is being reviewed, the analyst pushes the green buttons to indicate which of 12 different steps within a call sequence is occurring. The steps include greeting, hold and transfer, among others. Similarly, the yellow buttons enable the analyst to collect the time elapsed for each of 15 different screens that may be used and up to 15 actions taken after the call is finished.

This analysis resembles a traditional “time and motion” study, because in many ways it is just that. But the difference here is that we can use new automated tools, such as the voice and screen capture tools and data collector shown, as well as new approaches, to gain new insights.

The data capture tool also enables the analyst to collect up to 100 additional pieces of data, including the “secondary and tertiary call type.” (As an example, a credit call may be the primary call type, budget billing the secondary call type and a customer in arrears the tertiary call type.) The tool also lets the analyst use drop-down boxes to quickly collect data on transfers, hold time, mistakes made and opportunities noted.

Moreover, this process can be executed quickly. In our experience, it takes four trained employees five days to gather data on 1,000 calls.

STEP 2: ANALYZE THE DATA

Having collected this large amount of data, how do you use the information to reduce costs and improve customer and employee satisfaction? Again, having the right tool enables analysts to easily generate statistics and graphs from the collected data. Figure 2 shows the type of report that can be generated based on the recommended data collection.

The analytic value of Figure 2 is that it addresses the fact that most call center reports focus on “averages” and thus fail to reveal other important details. Figure 2 shows the 1,000 calls by call-handle time. Note that the “average” call took 4.65 minutes; however, many calls took a minute or less, and a disturbingly large number of calls took well over 11 minutes.

Using the captured data, utilities can then analyze what causes problem calls. In this example, we analyzed 5 percent of the calls (49 in total) and identified several problems:

  • Customer service representatives (CSRs) were taking calls for which they were inadequately trained, causing high hold times and inordinately large screen usage numbers.
  • IT systems were slow on one particular call type.
  • There were no procedures in place to intercede when an employee took more than a specified number of minutes to complete a call.
  • Procedures were laborious, due to Public Utilities Commission (PUC) regulations or – more likely – internally mandated rules.

This kind of analysis, which we describe as a “longest call” review, typically helps identify problems that can be resolved at minimal cost. In fact, our experience in utility and other call centers confirms that this kind of analysis often allows companies to cut call-handle time by 10 to 15 seconds.

It’s important to understand what 10 to 15 fewer seconds of call-handle time means to the call center – and, most importantly, to customers. For a typical utility call center with 200 or more CSRs, the shorter handle time can result in a 5 percent cost reduction, or roughly $1 million annually. Companies that can comprehend the economic value and customer satisfaction associated with reducing average handle time, even by one second, are likely to be better focused on solving problems and prioritizing solutions.
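To see how seconds translate into dollars, here is a back-of-the-envelope sketch. Every input – calls per CSR per day, fully loaded hourly cost, workdays per year – is an illustrative assumption, not a figure taken from any specific call center:

```python
def annual_savings(csrs, calls_per_csr_per_day, seconds_saved,
                   cost_per_csr_hour, workdays=250):
    """Rough annual value of shaving seconds off average handle time.

    All parameters are illustrative assumptions for a hypothetical
    call center, not benchmarks.
    """
    hours_saved_per_day = csrs * calls_per_csr_per_day * seconds_saved / 3600
    return hours_saved_per_day * cost_per_csr_hour * workdays

# 200 CSRs, 80 calls/day each, 15 seconds saved, $30/hour fully loaded:
print(round(annual_savings(200, 80, 15, 30)))  # prints 500000
```

Under these assumptions, 15 seconds per call is worth roughly half a million dollars a year; different volume and cost assumptions move the result toward the $1 million figure cited above.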

Surprisingly, the longest 5 percent of calls typically represent nearly 15 percent of the total call center handle time, representing a mother lode of opportunity for improvement.

Another important benefit that can result from this detailed examination of call center sampling data involves looking at hold time. A sample hold time analysis graph is pictured in Figure 3.

Excessive hold times tend to be caused by bad call routing, lengthy notes on file, unclear processes and customer issues. Each of these problems has a solution, usually low-cost and easily implemented. Most importantly, the value of each action is quantified and understood, based on the data collected.

Other useful questions to ask include:

  • What are the details behind the high average after-call work (ACW) time? How does this affect your call center costs?
  • How would it help budget discussions with IT if you knew the impact of such things as inefficient call routing, poor integrated voice response (IVR) scripts or low screen pop percentages?
  • What analyses can you perform to understand how you should improve training courses and focus your quality review efforts?

The output of these analyses can prove invaluable in budget discussions and in prioritizing improvement efforts, and is also useful in communicating proposals to senior management, CSRs, quality review staff, customers and external organizations. The data can also be the starting point for a Six Sigma review.

Utilities can frequently achieve a 20 percent cost reduction by collecting the right data and analyzing it at a sufficiently granular level. Following is a breakdown of the potential savings:

  • Three percent savings can be achieved by reducing longest calls by 10 seconds.
  • Five percent savings can be gained by reducing ACW by 15 seconds.
  • Five percent savings can be realized by improving call routing – usually by aligning CSR skills required with CSR skills available – by 15 seconds.
  • Three percent savings can be achieved by improving process for two frequent processes by 10 seconds each.
  • Three percent savings can be realized by improving IVR and screen pop frequency and quality of information by 10 seconds.
  • One percent savings can be gained by improving IT response time on selected screens by three seconds.
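As a quick sanity check, the individual line items above do stack up to the 20 percent figure. The labels below simply restate the bullets:

```python
# Illustrative tally of the per-initiative savings listed above (percent).
savings = {
    "reduce longest calls (10s)": 3,
    "reduce after-call work (15s)": 5,
    "improve call routing (15s)": 5,
    "streamline two frequent processes (10s each)": 3,
    "improve IVR / screen-pop quality (10s)": 3,
    "faster IT response on selected screens (3s)": 1,
}
total = sum(savings.values())
print(f"Total potential savings: {total}%")  # prints Total potential savings: 20%
```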

STEP 3: REVIEW AND MONITOR PROGRESS ON AN ONGOING BASIS

Although this white paper focuses on the data collection and analyses procedures used, the key difference in this approach is the optimization strategy behind it.

The approach outlined above starts with utilities recognizing that improvement opportunities exist, understanding the value of detailed data in identifying these opportunities and enabling the data collected to be easily presented and reviewed. Taken as a whole, this process can produce prioritized, high-ROI recommendations.

To gain the full value of this approach, utilities should do the following:

  • Engage the quality review team, trainers, supervisors and CSRs in the review process;
  • Expand the focus of the quality review team from looking only at individual CSRs’ performance to looking at organizational processes as well;
  • Have trainers embed the new lessons learned in training classes;
  • Encourage supervisors to reinforce lessons learned in team meetings and one-on-one coaching; and
  • Require CSRs to identify issues that can be studied in future reviews and follow the lessons learned.

Leading organizations perform these reviews periodically, building on their understanding of their call centers’ current status and using that understanding to formulate actions for future improvement.

Once the first study is complete, utilities also have a benchmark to which results from future studies can be compared. The value of having these prior analyses should be obvious in each succeeding review, as hold times decline, average handle times decrease, calls are routed more frequently to the properly skilled person and IT investments made based on ROI analyses begin to yield benefits.

Beyond these savings, customer and employee satisfaction should increase. When a call is routed to the CSR with the requisite skills needed to handle it, both the customer and the CSR are happier. Customer and CSR frustration will also be reduced when there are clear procedures to escalate calls, and IT systems fail less frequently.

IMPLEMENTING A CALL CENTER REVIEW

Although there are some commonalities in improving utilities’ call center performance, there are always unique findings specific to a given call center that help define the nature and volume of opportunities, as well as help chart the path to improvement.

By realizing that benefit opportunities exist and applying the process steps described above, and by using appropriate tools to reduce costs and improve customer and CSR satisfaction, utilities have the opportunity to transform the effectiveness of their call centers.

Perhaps we should end with another quote from Yogi: “The future ain’t what it used to be.” In fact, for utilities that implement these steps, the future will likely be much better.

Smart Metering Options for Electric and Gas Utilities

Should utilities replace current consumption meters with “smart metering” systems that provide more information to both utilities and customers? Increasingly, the answer is yes. Today, utilities and customers are beginning to see the advantages of metering systems that provide:

  • Two-way communication between the utility and the meter; and
  • Measurement that goes beyond a single consolidated quarterly or monthly consumption total to include time-of-use and interval measurement.

For many, “smart metering” is synonymous with an advanced metering infrastructure (AMI) that collects, processes and distributes metered data effectively across the entire utility as well as to the customer base (Figure 1).

SMART METERING REVOLUTIONIZES UTILITY REVENUE AND SERVICE POTENTIAL

When strategically evaluated and deployed, smart metering can deliver a wide variety of benefits to utilities.

Financial Benefits

  • Significantly speeds cash flow and associated earnings on revenue. Smart metering permits utilities to read meters and send the data directly to the billing application. Bills go out immediately, cutting days off the meter-to-cash cycle.
  • Improves return on investment via faster processing of final bills. Customers can request disconnects as the moving van pulls away. Smart metering polls the meter and gives the customer the amount of the final bill. Online or credit card payments effectively transform final bill collection cycles from a matter of weeks to a matter of seconds.
  • Reduces bad debt. Smart metering helps prevent bad debt by facilitating the use of prepayment meters. It also reduces the size of overdue bills by enabling remote disconnects, which do not depend on crew availability.

Operational Cost Reductions

  • Slashes the cost to connect and disconnect customers. Smart metering can virtually eliminate the costs of field crews and vehicles previously required to change service from the old to the new residents of a metered property.
  • Lowers insurance and legal costs. Field crew insurance costs are high – and they’re even higher for employees subject to stress and injury while disconnecting customers with past-due bills. Remote disconnects through smart metering lower these costs. They also reduce medical leave, disability pay and compensation claims. Remote disconnects also significantly cut the number of days that employees and lawyers spend on perpetrator prosecutions and attempts to recoup damages.
  • Cuts the costs of managing vegetation. Smart metering can pinpoint blinkouts, reducing the cost of unnecessary tree trimming.
  • Reduces grid-related capital expenses. With smart metering, network managers can analyze and improve block-by-block power flows. Distribution planners can better size transformers. Engineers can identify and resolve bottlenecks and other inefficiencies. The benefits include increased throughput and reductions in grid overbuilding.
  • Shaves supply costs. Supply managers use interval data to fine-tune supply portfolios. Because smart metering enables more efficient procurement and delivery, supply costs decline.
  • Cuts fuel costs. Many utility service calls are “false alarms.” Checking meter status before dispatching crews prevents many unnecessary truck rolls.
  • Reduces theft. Smart metering can identify illegal attempts to reconnect meters, or to use energy and water in supposedly vacant premises. It can also detect theft by comparing flows through a valve or transformer with billed consumption.

Compliance Monitoring

  • Ensures contract compliance. Gas utilities can use one-hour interval meters to monitor compliance from interruptible, or “non-core,” customers and to levy fines against contract violators.
  • Ensures regulatory compliance. Utilities can monitor the compliance of customers with significant outdoor lighting by comparing similar intervals before and during a restricted time period. For example, a jurisdiction near a wildlife area might order customers to turn off outdoor lighting so as to promote breeding and species survival.
Outage Management

  • Reduces outage duration by identifying outages more quickly and pinpointing outage and nested outage locations. Smart metering also permits utilities to ensure outage resolution at every meter location.
  • Sizes outages more accurately. Utilities can ensure that they dispatch crews with the skills needed – and adequate numbers of personnel – to handle a specific job.
  • Provides updates on outage location and expected duration. Smart metering helps call centers inform customers about the timing of service restoration. It also facilitates display of outage maps for customer and public service use.
  • Detects voltage fluctuations. Smart metering can gather and report voltage data. Customer satisfaction rises with rapid resolution of voltage issues.

New Services

For utilities that offer services besides commodity delivery, smart metering provides an entry to such new business opportunities as:

  • Monitoring properties. Landlords reduce costs of vacant properties when utilities notify them of unexpected energy or water consumption. Utilities can perform similar services for owners of vacation properties or the adult children of aging parents.
  • Monitoring equipment. Power-use patterns can reveal a need for equipment maintenance. Smart metering enables utilities to alert owners or managers to a need for maintenance or replacement.
  • Facilitating home and small-business networks. Smart metering can provide a gateway to equipment networks that automate control or permit owners to access equipment remotely. Smart metering also facilitates net metering, offering some utilities a path toward involvement in small-scale solar or wind generation.

Environmental Improvements

Many of the smart metering benefits listed above include obvious environmental benefits. When smart metering lowers a utility’s fuel consumption or slows grid expansion, cleaner air and a better preserved landscape result. Smart metering also facilitates conservation through:

  • Leak detection. When interval reads identify premises where water or gas consumption never drops to zero, leaks are an obvious suspect.
  • Demand response and critical peak pricing. Demand response encourages more complete use of existing base power. Employed in conjunction with critical peak pricing, it also reduces peak usage, lowering needs for new generators and transmission corridors.
  • Load control. With the consent of the owner, smart metering permits utilities or other third parties to reduce energy use inside a home or office under defined circumstances.
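The leak-detection heuristic described above – flag any premises whose interval consumption never drops to zero – is simple enough to sketch. This is illustrative logic only, not any vendor's implementation:

```python
def suspect_leak(interval_reads, baseline=0.0):
    """Flag a premises whose consumption never drops to the baseline.

    interval_reads: consumption per interval (e.g. gallons or cubic
    feet per hour). A household is normally idle overnight, so a trace
    with no zero-flow interval is an obvious leak suspect.
    """
    return min(interval_reads) > baseline

# A normal trace has zero-flow hours overnight; a leaky one never stops:
print(suspect_leak([0.0, 0.0, 1.2, 3.4, 0.5]))  # prints False
print(suspect_leak([0.3, 0.4, 1.2, 3.4, 0.5]))  # prints True
```

In practice a utility would also want a minimum observation window (several days) before flagging, to avoid false positives from legitimately continuous loads.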

CHALLENGES IN SMART METERING

Utilities preparing to deploy smart metering systems need to consider these important factors:

System Intelligence. There’s a continuing debate in the utility industry as to whether smart metering intelligence should be distributed or centralized. Initial discussions of advanced metering tended to assume intelligence embedded in meters. Distributed intelligence seemed part of a trend, comparable to “smart cards,” “smart locks” and scores of other everyday devices with embedded computing power.

Today, industry consensus favors centralized intelligence. Why? Because while data processing for purposes of interval billing can take place in either distributed or central locations, other applications for interval data and related communications systems cannot. In fact, utilities that opt for processing data at the meter frequently make it impossible to realize a number of the benefits listed above.

Data Volume. Smart metering inevitably increases the amount of meter data that utilities must handle. In the residential arena, for instance, using hour-long measurement intervals rather than monthly consumption totals replaces 12 annual reads per customer with 8,760 reads – a 730-fold increase.
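The arithmetic behind that figure:

```python
# Hourly interval reads vs. monthly consumption totals, per meter per year.
hourly_reads_per_year = 24 * 365            # 8,760 reads
monthly_reads_per_year = 12
increase = hourly_reads_per_year // monthly_reads_per_year
print(hourly_reads_per_year, increase)      # prints 8760 730
```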

In most utilities today, billing departments “own” metering data. Interval meter reads, however, are useful to many departments. These readings can provide information on load size and shape – data that can then be analyzed to help reduce generation and supply portfolio costs. Interval reads are even more valuable when combined with metering features like two-way communication between meter and utility, voltage monitoring and “last gasp” messages that signal outages.

This new data provides departments outside billing with an information treasure trove. But when billing departments control the data, others frequently must wait for access lest they risk slowing down billing to a point that damages revenue flow.

Meter Data Management. An alternative way to handle data volume and multiple data requests is to offload the data into a stand-alone meter data management (MDM) application.

MDM applications gather and store meter data. They can also perform the preliminary processing required for different departments and programs. Most important, MDM gives all units equal access to commonly held meter data resources (Figure 2).

MDM provides an easy pathway between data and the multiple applications and departments that need it. Utilities can more easily consolidate and integrate data from multiple meter types, and reduce the cost of building and maintaining application interfaces. Finally, MDM provides a place to store and use data whose inflow cannot be regulated – for example, the flood of almost simultaneous “last gasp” messages from tens of thousands of meters during a major outage.

WEIGHING THE COSTS AND BENEFITS OF SMART METERING

Smart metering on a mass scale is relatively new. No utility can answer all questions in advance. There are ways, however, to mitigate the risks:

Consider all potential benefits. Smart metering may be a difficult cost to justify if it rests solely on customer acceptance of demand response. Smart metering is easier to cost-justify when its deployment includes, for instance, the value of the many benefits listed above.

Evaluate pilots. Technology publications are full of stories about successful pilots followed by unsuccessful products. That’s because pilots frequently protect participants from harsh financial consequences. And it’s difficult for utility personnel to avoid spending time and attention on participants in ways that encourage them to buy into the program. Real-life program rollouts lack these elements.

Complicating the problem are likely differences between long-term and short-term behavior. The history of gasoline conservation programs suggests that while consumers initially embrace incentives to car pool or use public transportation, few make such changes on a permanent basis.

Examining the experiences of utilities in the smart metering forefront – in Italy, for example, or in California and Idaho – may provide more information than a pilot.

Develop a complete business case. Determining the cost-benefit ratio of smart metering is challenging. Some costs – for example, meter prices and installation charges – may be relatively easy to determine. Others require careful calculations. As an example, when interval meters replace time-of-use meters, how does the higher cost of interval meters weigh against the fact that they don’t require time-of-use manual reprogramming?

As in any business case, some costs must be estimated:

  • Will customer sign-up equal the number needed to break even?
  • How long will the new meters last?
  • Do current meter readers need to be retrained, and if so, what will that cost?
  • Will smart metering help retain customers that might otherwise be lost?
  • Can new services such as equipment efficiency analyses be offered, and if so, how much should the utility charge for them?

Since some utilities are already rolling out smart metering programs, it’s becoming easier to obtain real-life numbers (rather than estimates) to plug into your business case.

CONSIDER ALTERNATIVES

Technology is “smart” only when it reduces the cost of obtaining specified objectives. Utilities may find it valuable to try lower-cost routes to some results, including:

  • Customer charges to prevent unnecessary truck rolls. Such fees are common among telephone service providers and have worked well for some gas utilities responding to repeated false alarms from householder-installed carbon monoxide detectors.
  • Time-of-use billing with time/rate relationships that remain constant for a year or more. This gives consumers opportunities to make time-shifting a habit.
  • Customer education to encourage consumers to use the time-shifting features on their appliances as a contribution to the environment. Most consumers have no idea that electricity goes to waste at night. Keeping emissions out of the air and transmission towers out of the landscape could be far more compelling to many consumers than a relatively small saving resulting from an on- and off-peak pricing differential.
  • Month-to-month rate variability. One study found that approximately a third of the efficiency gains from real-time interval pricing could be captured by simply varying the flat retail rates monthly – and at no additional cost for metering. [1] While a third of the efficiency gains might not be enough to attain long-term goals, they might be enough to fill in a shorter-term deficit, permitting technology costs and regulatory climates to stabilize before decisions must be made.
  • Multitier pricing based on consumption. Today, two-tier pricing – that is, a lower rate for the first few hundred kilowatt-hours per month and a higher rate for additional kilowatt-hours – is common. However, three or four tiers might better capture the attention of those whose consumption is particularly high – owners of large homes and pool heaters, for instance – without burdening those at the lower end of the economic ladder. Tiers plus exception handling for hardships like high-consuming medical equipment would almost certainly be less difficult and expensive than universal interval metering.

A thorough evaluation of the benefits and challenges of advanced metering systems, along with an understanding of alternative means to achieving those benefits, is essential to utilities considering deployment of advanced metering systems.

Note: The preceding was excerpted from the Oracle white paper “Smart Metering for Electric and Gas Utilities.” To receive the complete paper, Email oracleutilities_ww@oracle.com.

ENDNOTE

  1. Holland and Mansur, “The Distributional and Environmental Effects of Time-varying Prices in Competitive Electricity Markets.” Results published in “If RTP Is So Great, Why Don’t We See More of It?” Center for the Study of Energy Markets Research Review, University of California Energy Institute, Spring 2006. Available at www.ucei.berkeley.edu/

Skinflint Search Marketing

I admit it – I’m a skinflint. Call me a tightwad, a miser – I don’t care. Basically, I’m cheap. And even if you’re not cheap by personality, you might need to conserve cash by necessity. If that’s your situation, don’t despair. The Internet is tailor-made for you. Internet marketing, and search marketing in particular, is the land of the free. So step up, you skinflints, and let’s see what you can do for nothing.

Organic search is always free, in the same sense that public relations efforts are free – you don’t pay anyone to run advertising to get your message out there. Instead, you come up with a good story and run it by the gatekeepers – the ones between you and your target markets.

For public relations, the gatekeepers are reporters, editors and other folks with their grip on the media that your audience consumes. It doesn’t cost you any money to get coverage in these media outlets, but it definitely costs time and ingenuity to come up with an idea and persuade the gatekeepers to pass it through.

Organic search marketing has the same elements as public relations, except the gatekeepers are Google and the other search engines. You must “persuade” the search engines to show your story – by giving it a high ranking for a search keyword – before it reaches your audience. That’s a big part of what organic search marketing is all about.

The problem is that organic search requires so much work that you’re tempted to automate a lot of it. That’s where the costs can come in.

Can Free Search Optimization Tools Be Enough?

As with many questions, the answer to whether free tools will be enough for your search campaigns is, “it depends.” What’s clear to me, however, is that free tools are the place to start. It’s best to see how far you can go with the free thing before you lay out a bundle of cash for a high-end tool.

We don’t have room in this article to list all the leading freebies, but let’s look at some of what’s out there. You can find a more comprehensive treatment on my website (at www.mikemoran.com/skinflint) with links to these tools and more.

Forecast your campaign. Good direct marketing principles start by identifying the criteria for success. My website has a free spreadsheet that helps you identify the value of search marketing, even before you begin your campaign. You can project your extra traffic and see how much more revenue it brings – just the thing to justify your plans to the boss.

Get your pages indexed. If your pages aren’t indexed, they’ll never be found. You can use MarketLeap’s free Saturation Tool to check how many pages you’ve got indexed on the leading search engines and then use the free Sitemaps protocol to get more of your pages indexed. You can also use free tools to check your robots settings and validate your HTML, helping you eliminate some common causes of pages being ignored by spiders.

Plan your keywords. If you don’t know what your audience is looking for, you can’t tune your pages to be found for the right words. For years Yahoo’s Keyword Selector Tool was the best free offering, but it spent most of 2007 showing January’s numbers when you’d expect updates each month. Trellian jumped into the void with a free version of its Keyword Discovery tool that helps you find keyword variations along with the search volume you can expect for each one.

Optimize your page content. Analyze your keyword density (the percentage of keywords in your content) and keyword prominence (the importance of the places where they appear) with free tools from Ranks and WebCEO. The results can help you decide how to change your pages to improve your rankings.

Attract links from other sites. Use Backlinkwatch or PRWeaver to analyze the links to your site and to identify where you might prospect for more. The results can form the start of a link-building campaign if you carefully approach the right people with valuable content on your site that their readers care about.

Measure your results. Use free rank checkers from Digital Point and Mike’s Marketing Tools to see where you stand. Then use Google Analytics or the Deep Log Analyzer to count the traffic from search engines keyword by keyword. Google Analytics can also measure your conversions – the number of folks who bought from you or responded positively in some other way.

Will these free tools work in every situation? No. Some tools are limited in scope or in the volume they can handle, and many are limited in features. Perhaps the biggest drawback of free tools is lack of integration – you’ll need to manage all of these free tools and often move data back and forth between them to manage your campaign. It ain’t seamless. But what do you want for nothing?

If you do need to move up in class, some of these free tools are actually the starter versions of more comprehensive fee-based offerings. Regardless, you’ll have gained valuable experience in using the free tools that will help you target the exact features that you need to pay for when you decide to take the plunge to spend money for a tool.

Free Paid Search

I know that “free paid search” sounds like an oxymoron (or perhaps an oxyMoran when I say it), but there are a few free ways to get paid search traffic.

One way is to submit your product to Google Base (you’ll show up on Google Product Search also). Neither of these properties produces a huge number of sales – other product search sites (the ones you pay for) are the leaders in this space – but there’s a lot to be said for free revenue. You might try out your shopping search feeds on these sites and open your wallet to the big guys when you have worked out the kinks in your content.

Another free way to do paid search is to use other people’s money. Can you steal some money for paid search from the sales budget or from other marketing budgets inside your company? Can you work on cooperative advertising with a complementary product? Perhaps if you agree to run the paid search campaign, you can get others to foot the bill.

Regardless of how you do it, search marketing is ideal for marketers with empty pockets. See my website (www.mikemoran.com/skinflint) where you’ll find more free ideas for doing search marketing, plus links to the tools described here. You’ll also see how to apply the skinflint approach to other kinds of Internet marketing campaigns. And every idea is your favorite price: free.

Mike Moran is an IBM Distinguished Engineer and product manager for IBM’s OmniFind search product. Mike’s books include Search Engine Marketing, Inc. and Do It Wrong Quickly. He can be reached through his website (mikemoran.com).

Leveraging the Data Deluge: Integrated Intelligent Utility Network

If you define a machine as a series of interconnected parts serving a unified purpose, the electric power grid is arguably the world’s largest machine. The next-generation version of the electric power grid – called the intelligent utility network (IUN), the smart grid or the intelligent grid, depending on your nationality or information source – provides utilities with enhanced transparency into grid operations.

Considering the geographic and logical scale of the electric grid from any one utility’s point of view, a tremendous amount of data will be generated by the additional “sensing” of the workings of the grid provided by the IUN. This output is often described as a “data flood,” and the implication that businesses could drown in it is apropos. For that reason, utility business managers and engineers need analytical tools to keep their heads above water and obtain insight from all this data. Paraphrasing the psychologist Abraham Maslow, the “hierarchy of needs” for applying analytics to make sense of this data flood could be represented as follows (Figure 1).

  • Insight represents decisions made based on analytics calculated using new sensor data integrated with existing sensor or quasi-static data.
  • Knowledge means understanding what the data means in the context of other information.
  • Information means understanding precisely what the data measures.
  • Data represents the essential reading of a parameter – often a physical parameter.

In order to reap the benefits of accessing the higher levels of this hierarchy, utilities must apply the correct analytics to the relevant data. One essential element is integrating the new IUN data with other data over the various time dimensions. Indeed, it is analytics that allow utilities to truly benefit from the enhanced capabilities of the IUN compared to the traditional electric power grid. Analytics can consist solely of calculations (such as measuring reactive power), or they can be rule-based (such as rating a transformer as “stressed” if it runs at more than 120 percent of its nameplate rating over a two-hour period).
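A rule like the transformer example above is easy to express in code. The sketch below is illustrative only – the function name, reading format and sample data are assumptions, with the 120 percent/two-hour rule taken from the text:

```python
from datetime import datetime, timedelta

def is_stressed(readings, threshold=1.20, window=timedelta(hours=2)):
    """Return True if load stays above threshold for a full window.

    readings: list of (timestamp, load as a fraction of nameplate rating).
    Simplified: assumes readings arrive often enough that a contiguous
    run of over-threshold samples represents continuous overload.
    """
    run_start = None
    for t, load in sorted(readings):
        if load > threshold:
            if run_start is None:
                run_start = t
            if t - run_start >= window:
                return True
        else:
            run_start = None  # overload interrupted; reset the clock
    return False

# Hypothetical hourly readings: 125%, 130% and 122% of nameplate rating.
readings = [
    (datetime(2008, 6, 1, 12, 0), 1.25),
    (datetime(2008, 6, 1, 13, 0), 1.30),
    (datetime(2008, 6, 1, 14, 0), 1.22),
]
print(is_stressed(readings))  # True: above 120 percent across two hours
```

The same shape – a threshold, a time window and a reset condition – covers a surprising number of the rule-based analytics described here.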

The data to be analyzed comes from multiple sources. Utilities have for years had supervisory control and data acquisition (SCADA) systems in place that employ technologies to transmit voltage, current, watts, volt ampere reactives (VARs) and phase angle via leased telephone lines at 9,600 baud, using the distributed network protocol (DNP3). Utilities still need to integrate this basic information from these systems.

In addition, modern electrical power equipment often comes with embedded microprocessors capable of generating useful non-operational information. This can include switch closing time, transformer oil chemistry and arc durations. These pieces of equipment – generically called intelligent electronic devices (IEDs) – often have local high-speed sequence-of-event recorders that can be programmed to deliver even more data for post-event analysis.

An increasing number of utilities are beginning to see the business cases for implementing an advanced metering infrastructure (AMI). A large-scale deployment of such meters would also function as a fine-grained edge sensor system for the distribution network, providing not only consumption but voltage, power quality and load phase angle information. In addition, an AMI can be a strategic platform for initiating a program of demand-response load control. Indeed, some innovative utilities are considering two-way AMI meters to include a wireless connection such as Zigbee to the consumer’s home automation network (HAN), providing even finer detail to load usage and potential controllability.

Companies must find ways to analyze all this data, both from explicit sources such as IEDs and implicit sources such as AMI or geographical information systems (GIS). A crucial aspect of IUN analysis is the ability to integrate conventional database data with time-synchronized data, since an analytic computed in isolation may be less useful than no analytic at all.

CATEGORIES AND RELATIONSHIPS

There are many different categories of analytics that address the specific needs of the electric power utility in dealing with the data deluge presented by the IUN. Some depend on the state regulatory environment, which not only imposes operational constraints on utilities but also determines the scope and effect of what analytics information exchange is required. For example, a generation-to-distribution utility – what some fossil plant owners call “fire to wire” – may have system-wide analytics that link load dispatch to generation economics, transmission line realities and distribution customer load profiles. Other utilities operate power lines only, and may not have their own generation capabilities or interact with consumers at all. Utilities like these may choose to focus initially on distribution analytics such as outage prediction and fault location.

Even the term analytics can have different meanings for different people. To the power system engineer it involves phase angles, voltage support from capacitor banks and equations that take the form “a + j*b.” To the line-of-business manager, integrated analytics may include customer revenue assurance, lifetime stress analysis of expensive transformers and dashboard analytics driving business process models. Customer service executives could use analytics to derive emergency load control measures based on a definition of fairness that could become quite complex. But perhaps the best general definition of analytics comes from the Six Sigma process mantra of “define, measure, analyze, improve, control.” In the computer-driven IUN, this would involve:

  • Define. This involves sensor selection and location.
  • Measure. SCADA systems enable this process.
  • Analyze. This can be achieved using IUN Analytics.
  • Improve. This involves grid performance optimization, as well as business process enhancements.
  • Control. This is achieved by sending commands back to grid devices via SCADA, and by business process monitoring.

The term optimization can also be interpreted in several ways. Utilities can attempt to optimize key performance indicators (KPIs) such as the system average interruption duration index (SAIDI, which is somewhat consumer-oriented), grid efficiency in terms of megawatts lost to component heating, business processes (such as minimizing outage time to repair) or meeting energy demand with minimum incremental fuel cost.
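SAIDI itself is simple arithmetic – by the standard IEEE 1366 definition, total customer-minutes of interruption divided by total customers served. A minimal sketch, with outage figures invented for illustration:

```python
# SAIDI = sum over outages of (customers interrupted x outage duration)
#         / total customers served   (standard IEEE 1366 definition).
# The outage records and customer count below are made up.
outages = [
    {"customers": 1200, "minutes": 90},
    {"customers": 300, "minutes": 45},
]
total_customers_served = 50000

saidi = sum(o["customers"] * o["minutes"] for o in outages) / total_customers_served
print(round(saidi, 2))  # 2.43 average interruption minutes per customer served
```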

Although optimization issues often cross departmental boundaries, utilities may make compromises for the sake of achieving an overall strategic goal that can seem elusive or even run counter to individual financial incentives. An important part of higher-level optimization – in a business sense rather than a mathematical one – is the need for a utility to document its enterprise functions using true business process modeling tools. These are essential to finding better application integration strategies. That way, the business can monitor the advisories from analytics in the tool itself, and more easily identify business process changes suggested by patterns of online analytics.

Another aspect of IUN analytics involves – using a favorite television news phrase – “connecting the dots.” This means ensuring that a utility actually realizes the impact of a series of events on an end state, even though the individual events may appear unrelated.

For example, take complex event processing (CEP). A “simple” event might involve a credit card company’s software verifying that your credit card balance is under the limit before sending an authorization to the merchant. A “complex” event would take place if a transaction request for a given credit card account was made at a store in Boston, and another request an hour later in Chicago. After taking into account certain realities of time and distance, the software would take an action other than approval – such as instructing the merchant to verify the cardholder’s identity.
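In code, the heart of such a complex-event rule is a time-and-distance plausibility check. The sketch below is hypothetical – the speed cutoff and distance figure are illustrative assumptions, not anything a real CEP engine prescribes:

```python
from datetime import datetime

# Flag two card swipes whose implied travel speed is physically implausible.
MAX_PLAUSIBLE_MPH = 600  # assumed cutoff: roughly airliner speed

def suspicious(swipe_a, swipe_b, distance_miles):
    """Return True if moving between the two swipes would require
    traveling faster than the plausibility cutoff."""
    hours = abs((swipe_b["time"] - swipe_a["time"]).total_seconds()) / 3600
    if hours == 0:
        return True  # simultaneous swipes in two places
    return distance_miles / hours > MAX_PLAUSIBLE_MPH

# The article's example: Boston, then Chicago an hour later.
boston = {"city": "Boston", "time": datetime(2008, 6, 1, 10, 0)}
chicago = {"city": "Chicago", "time": datetime(2008, 6, 1, 11, 0)}
print(suspicious(boston, chicago, 850))  # True: 850 miles in one hour
```

Substitute feeders, faults and weather forecasts for cards and cities, and the same pattern applies to the grid scenarios below.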

Back in the utilities world, consideration of weather forecasts in demand-response action planning, or distribution circuit redundancy in the face of certain existing faults, can be handled by such software. The key in developing these analytics is not so much about establishing valid mathematical relationships as it is about giving a businessperson the capability to create and define rules. These rules must be formulated within an integrated set of systems that support cross-functional information. Ultimately, it is the businessperson who relates the analytics back to business processes.

AVAILABLE TOOLS

Time can be a critical variable in successfully using analytics. In some cases, utilities require analytics to be responsive to the electric power grid’s need to input, calculate and output in an actionable time frame.

Utilities often have analytics built into functions in their distribution management or energy management systems, as well as individual analytic applications, both commercial and home-grown. And some utilities are still making certain decisions by importing data into a spreadsheet and using a self-developed algorithm. No matter what the source, the architecture of the analytics system should provide a non-real-time “bus,” often a service-oriented architecture (SOA) or Web services interface, but also a more time-dependent data bus that supports common industry tools used for desktop analytics within the power industry.

It’s important that everyone in the utility follows internally published standards for interconnecting analytics to the buses, so all authorized stakeholders can access them. Utilities should also set enterprise policy for special connectors, manual entry and duplication of data – otherwise known as SOA governance.

The easier it is for utilities to use the IUN data, the less likely it is that their engineering, operations and maintenance staffs will be overwhelmed by the task of actually acquiring the data. Although the term “plug and play” has taken on certain negative connotations – largely due to the fact that few plug-and-play devices actually do that – the principle of easily adding a tool is still both valid and valuable. New instances of IUN can even include Web 2.0 characteristics for the purpose of mash-ups – easily configurable software modules that link, without pain, via Web services.

THE GOAL OF IMPLEMENTING ANALYTICS

Utilities benefit from applying analytics by making the best use of integrated utility enterprise information and data models, and unlocking employee ideas or hypotheses about ways to improve operations. Often, analytics are also useful in helping employees identify suspicious relationships between data. The widely lamented “aging workforce” issue typically involves the loss of senior staff who can visualize relationships that aren’t formally captured, and who were able to make connections that others didn’t see. Higher-level analytics can partly offset the impact of the aging workforce brain drain.

Another type of analytics is commonly called “business intelligence.” But although a number of best-selling general-purpose BI tools are commercially available, utilities need to ensure that the tools have access to the correct, unique, authoritative data. Upon first installing BI software, there’s sometimes a tendency among new users to quickly assemble a highly visual dashboard – without regard to the integrity of the data they’re importing into the tool.

Utilities should also create enterprise data models and data dictionaries to ensure the accuracy of the information being disseminated throughout the organization. After all, utilities frequently use analytics to create reports that summarize data at a high level. Yet some fault detection schemes – such as identifying problems in buried cables – may need original, detailed source data. For that reason utilities must have an enterprise data governance scheme in place.

In newer systems, data dictionaries and models can be provided by a Web service. But even if the dictionary consists of an intermediate lookup table in a relational database, the principles still hold: Every process and calculated variable must have a non-ambiguous name, a cross-reference to other major systems (such as a distribution management system [DMS] or geographic information system [GIS]), a pointer to the data source and the name of the person who owns the data. It is critical for utilities to assign responsibility for data accuracy, validation, source and caveats at the beginning of the analytics engineering process. Finding data faults after they contribute to less-than-correct results from the analytics is of little use. Utilities may find data scrubbing and cross-validation tools from the IT industry to be useful where massive amounts of data are involved.
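A minimal dictionary entry following these principles might look like the sketch below. The class and all field values are hypothetical examples, not a prescribed schema – the point is that each variable carries a non-ambiguous name, cross-references to major systems, a source pointer and a named owner:

```python
from dataclasses import dataclass, field

@dataclass
class DictionaryEntry:
    name: str                                        # non-ambiguous variable name
    cross_refs: dict = field(default_factory=dict)   # links to DMS, GIS, etc.
    source: str = ""                                 # pointer to the data source
    owner: str = ""                                  # person responsible for accuracy

# Illustrative entry; identifiers are invented.
entry = DictionaryEntry(
    name="feeder_12_reactive_power_kvar",
    cross_refs={"DMS": "FDR-12/Q", "GIS": "asset-0417"},
    source="SCADA historian tag SC.F12.Q",
    owner="J. Engineer",
)
print(entry.name)
```

Whether the dictionary lives in a Web service or a relational lookup table, carrying these four facts with every variable is what makes fault-finding possible before bad data contaminates the analytics.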

Utilities have traditionally used simulation primarily as a planning tool. However, with the continued application of Moore’s law, the ability to feed a power system simulation with real-time data and solve a state estimation in real time can result in an affordable crystal ball for predicting problems, finding anomalies or performing emergency problem solving.

THE IMPORTANCE OF STANDARDS

The emergence of industry-wide standards is making analytics easier to deploy across utility companies. Standards also help ease the path to integration. After all, most electrons look the same around the world, and the standards arising from the efforts of Kirchhoff, Tesla and Maxwell have been broadly adopted globally. (Contrary views from the quantum mechanics community will not be discussed here!) Indeed, having a documented, self-describing data model is important for any utility hoping to make enterprise-wide use of data for analytics; using an industry-standard data model makes the analytics more easily shareable. In an age of greater grid interconnection, more mergers and acquisitions, and staff shortages, utilities’ ability to reuse and share analytics and create tools on top of standards-based data models has become increasingly important.

Standards are also important when interfacing to existing utility systems. Although the IUN may be new, data on existing grid apparatus and layout may be decades old. By combining the newly added grid observations with the existing static system information to form a complete integration scenario, utilities can leverage analytics much more effectively.

When deploying an IUN, there can be a tendency to use just the newer, sensor-derived data to make decisions, because one knows where it is and how to access it. But using standardized data models makes incorporating existing data less of an issue. There is nothing wrong with creating new data models for older data.

CONCLUSION

To understand the importance of analytics in relation to the IUN, imagine an ice-cream model (pick your favorite flavor). At the lowest level we have data: the ice cream is 30 degrees. At the next level we have information: you know that it is 30 degrees on the surface of the ice cream, and that it will start melting at 32 degrees. At the next level we have knowledge: you’re measuring the temperature of the middle scoop of a three-scoop cone, and therefore when it melts, the entire structure will collapse. At the insight level we bring in other knowledge – such as that the ambient air temperature is 80 degrees, and that the surface temperature of the ice cream has been rising 0.5 degrees per minute since you purchased it. Then the gastronomic analytics activate and take preemptive action, causing you to eat the whole cone in one bite, because the temporary frozen-teeth phenomenon is less of a business risk than having the scoops melt and fault to ground.

Search Marketing Is Direct Marketing

When I say the word “marketing,” what do you think of? Probably some kind of advertising – maybe a TV commercial for Coke. That’s brand marketing, and it’s gotten the lion’s share of attention from marketers for decades.

Far fewer people are direct marketers – the folks behind the catalogs and mail solicitations that fill our mailboxes. If you know any direct marketers, you may want to hire them to run your search marketing campaigns. Let’s look at the basics of direct marketing to find out why.

The Name of the Game Is Response

Direct marketing is truly measurable marketing. Unlike most TV commercials, every direct marketing message is designed to evoke a response, such as “call this number now” or “mail your order form today.” The return on direct marketing investment is based on how many customers respond to those messages. A very successful direct marketing campaign might sport a 4 percent response rate; a failure, less than one-half of 1 percent. Direct marketers make their money by increasing response rates.

Think about it. It doesn’t cost any more to mail a catalog that drives 4 percent response than one that drives 2 percent. The creative costs, paper costs, printing costs and mailing costs are about the same for each mailing, so smart direct marketers focus on raising response to bring more return from the same investment. Direct marketers spend their time figuring out just what causes more people to respond. A different offer on the outside of the envelope might get more people to open it. A different picture and product description in a catalog might cause more people to order. A yellow sticky that says, “Before you pass on our offer, read this” might cause a few people to do just that.

But how do direct marketers know what worked? They measure the response. They measure changes in response to every small variant of their sales pitch. And they keep the changes that work and throw the rest away.

When credit card marketers send out a million pieces of mail to sign up new customers, they don’t just write a letter and mail it out. Instead they write 10 or 20 different letters and mail them to 1,000 people each. Then they mail the version of the letter that generated the best response to the rest of that million-person list.
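The selection step in that letter test is trivial to express. In the sketch below the variant names and response counts are invented for illustration:

```python
# Mail each letter variant to a small sample, then roll out the variant
# with the best response rate. All figures here are made up.
samples = {
    "letter_a": {"mailed": 1000, "responses": 22},
    "letter_b": {"mailed": 1000, "responses": 31},
    "letter_c": {"mailed": 1000, "responses": 26},
}

def response_rate(stats):
    return stats["responses"] / stats["mailed"]

# Pick the variant with the highest sample response rate.
winner = max(samples, key=lambda k: response_rate(samples[k]))
print(winner, response_rate(samples[winner]))  # letter_b 0.031
```

The Web version of this loop is exactly the A/B testing that the rest of the article urges: same experiment, far richer measurement.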

Direct marketers constantly tweak their messages to become more persuasive. They continuously experiment with new ideas. It may seem picayune to focus on raising response rates from 2.2 percent to 2.6 percent, but just such increases mark breakthrough direct marketing campaigns.

Another way to increase return is to cull your mailing list. If you know that certain customers never seem to buy, you can eliminate those addresses from the list and add new ones that might prove more profitable. Your mailing costs are the same, but your responses will go up.

You can see that the basics of direct marketing revolve around experimenting with your messages and your mailing list to drive more and more sales for the same cost. You can apply those basics to Web marketing, too.

Web marketing, done well, is the biggest direct marketing opportunity ever, because the Web is infinitely more measurable than off-line direct marketing. Off-line direct marketers can measure only the final response – the mail order or the phone call, for example. They can’t tell the difference between those who threw the envelope away without opening it and those who read the entire message but still did not respond. If they could, they’d know whether to change the message on the outside of the envelope or change the letter itself.

The kind of measurement the Web offers is the stuff of direct marketers’ dreams.

Do Your Metrics Measure Up?

Steve DiPietro is amazed at how frequently prospective clients parrot clickthrough percentages, Web traffic statistics and conversion ratios with great enthusiasm but little to no understanding of their value to their organizations. Increasing a conversion rate from 12 to 15 percent can become a goal unto itself, as marketers immersed in number crunching lose sight of whether sales are actually growing.

Making Sense of Metrics

ALGORITHM: A set of mathematical equations or rules that a search engine uses to rank the content contained within its index in response to a particular search query.

ANALYTICS: Technology that helps analyze the performance of a website or online marketing campaign.

BENCHMARK REPORT: A report used to mark where a website falls on a search engine’s results page for a list of keywords. Subsequent search engine position reports are compared against this baseline.

CHARGEBACK: An incomplete sales transaction that results in an affiliate commission deduction. For example: merchandise is purchased and then returned.

CLICK & BYE: The process in which an affiliate loses a visitor to the merchant’s site once they click on a merchant’s banner or text link.

CLICKTHROUGH: The process of activating a link, usually on an online advertisement connecting to the advertiser’s website or landing page.

CLICKTHROUGH RATE (CTR): The percentage of those clicking on links out of the total number who see the links. For example: If 20 people do a Web search and 10 of those 20 people all choose one particular link, that link has a 50 percent clickthrough rate.

CONVERSION RATE: The percentage of clicks that result in a commissionable activity such as a sale or lead.

CONVERSION REPORTING: A measurement for tracking conversions and lead generation from search engine queries. It identifies the originating search engine, keywords, specific landing pages entered and the related conversion for each.

HIT: Request from a Web server for a graphic or other element to be displayed on a Web page.

IMPRESSION: An advertising metric that indicates how many times an advertising link is displayed.

KEYWORD: The word(s) a searcher enters into a search engine’s search box. Also the term that the marketer hopes users will search on to find a particular page.

PAGE VIEW: This occurs each time a visitor views a Web page, irrespective of how many hits are generated; a single page view may comprise many individual file requests (hits).

RANK: How well a particular Web page or website is listed in a search engine’s results.

UNIQUE VISITORS: Individuals who visited a site during the report period – usually 30 days. If someone visits more than once, they are counted only the first time they visit.

“It’s sad and somewhat surprising that after all this time there is a pervasive lack of understanding … of how these numbers correlate with how to make money,” says DiPietro, who works with clients large and small as the president of the Marlton, N.J.-based DiPietro Marketing Group.

Many marketers continue to rely on basic campaign performance data as the primary or even sole metric for measuring success, according to DiPietro. People often get caught up in the measurability of online campaigns and miss the ultimate corporate objective of a marketing campaign – to increase profitability.

Despite many marketers’ incomplete understanding of how buying keywords affects the bottom line, search marketing spending continues to grow rapidly. According to a survey conducted by the Search Engine Marketing Professional Organization (SEMPO), advertisers in the U.S. and Canada spent $5.75 billion on search engine marketing in 2005, up 44 percent from the previous year. Search engine marketing spending in North America is projected to reach $11 billion per year by 2010.

Some marketers whose careers started in the brick-and-mortar world have seemingly become spellbound by the top-level data for measuring marketing campaigns and forget their “old-school” fundamental tenets about increasing sales and stockholder value, according to DiPietro. Finding methods of doubling the conversion rate of a keyword campaign is admirable, but who cares if sales don’t grow? Estimating the value of a keyword purchase by focusing on clickthrough rates or increasing traffic to the website is an easy way to justify spending, but may be totally meaningless, DiPietro says.

The clickthrough ratio is analogous to the batting average in baseball – it is easy to compute and understand, and therefore is the most relied-upon statistic. However, during the past few decades, baseball executives such as the Oakland A’s Billy Beane, who probe deeper into statistics, have learned that other metrics – such as on-base percentage – are more directly related to achieving the objective (scoring more runs). The A’s have managed to succeed while spending considerably less than competitors, and many fellow baseball executives now are looking beyond the batting average. Similarly, marketers who identify the metrics that more closely correlate to their specific goals can increase their success.

MATCHING GOALS

Getting customers to your website is an important first step in increasing revenue, but determining the return on the investment requires analyzing what happens after they arrive at your doorstep. “You must have an action attached to [increasing traffic] or the campaign is useless,” says Douglas Brooks, vice president of consulting firm Marketing Management Analytics.

Before embarking on a campaign, marketers must define the objective – be it increasing leads, sales or brand recognition – and apply the appropriate metric, according to Brooks. The most appropriate metric may depend on whether the company is focused on e-commerce sales or if sales staff is usually involved in any transaction. Different yardsticks are appropriate for companies that use their website as a direct sales channel than for companies who are focused on generating leads that are converted off-line, he says.

Companies that rely on sales personnel should look at the volume of leads a campaign generates, according to Jerry Moyer, manager of analytics at interactive agency Refinery. Moyer says he tells his media clients – many of whom continue to focus on clickthrough rates – that tracking leads is a more effective barometer of campaign performance.

Campaigns that drive traffic to a website that cannot identify where visitors came from may be over- or underestimating their effectiveness, according to Moyer. By using first-party cookies and analyzing all of the activities that occur over time, advertisers can better understand the value of the leads generated.

Using cookies enables marketers to identify the unique visitors, according to Andrew Hanlon, who owns advertising agency Hanlon Creative. Cookies enable companies to track how many times a visitor was exposed to messaging during an entire campaign, as well as counting the total number of interactions on a website before visitors enter personal information and become a lead. “Unique visitors is the most raw level of success; you have to consider how many [leads resulted],” Hanlon says.

For example, Designer Linens Outlet implemented first-party cookies and saw revenue from returning customers increase by 45 percent and shopping cart conversions increase by 20 percent, according to Web analytics firm WebTrends, which managed the campaign.

Measuring the quality of leads is as important as the clickthrough ratios or total Web traffic generated by a campaign, according to Hanlon. He says many of his Hatboro, Pa., agency’s clients ($20 million to $1 billion in sales) “rarely know what they are asking for” when trying to gauge the impact of campaigns on sales.

He stresses to clients the importance of tracking leads throughout the entire sales process. “The client has to be able to act on the data – what happens with the lead after it is collected,” he says. The ability of keywords to generate leads varies widely, says Hanlon. Marketers should use metrics that create quality leads versus those that merely drive traffic.

If branding is the goal, then measuring increases in traffic can be appropriate since many keywords generate low-quality leads, Hanlon says. Companies looking to reinforce messaging through multiple media should consider several online metrics, according to Jason Palmer, vice president of product strategy at WebTrends.

LANDING CLIENTS

Some campaigns are incorrectly viewed as ineffective because of low conversion rates, according to consultant Hanlon. Landing pages that were not designed to entice visitors to delve deeper into a website could turn away potential leads, so their effectiveness must also be evaluated. Landing pages should have interesting content such as blogs or unique offers to encourage clickthroughs, says Hanlon. Companies should measure conversion ratios after visitors hit a landing page, and if they are shown to be “dead ends,” they should revise the landing pages to add more content, he says.

Software companies including WebTrends and Salesforce.com are developing applications that zero in on landing-page performance. For example, Webtrends Dynamic Search evaluates the effectiveness of the landing page and keyword in matching specific company objectives.

Tweaking the content of a landing page can increase the percentage of clicks converted to leads by as much as a factor of 10, according to Kraig Swensrud, senior director of product marketing at Salesforce.com. Tracking and improving landing-page conversions is equivalent to increasing money spent on Google AdWords, he says. “Everything is interconnected – as soon as you have visibility on [landing-page] conversion rate, you can impact change,” says Swensrud.

FROM CLICKS TO SALES

The best metrics link gains in Web traffic or clickthrough percentage to the overall business objectives – increasing sales, profitability and effect on the stock price. Consultant DiPietro recommends that the break-even sales analysis applied to off-line marketing also be applied online. Companies should calculate how many sales – based on the profit margin per average sale – a campaign would have to generate to determine whether it is a good investment.

“Whether it’s participating in a trade show, setting up an affiliate program or a PPC campaign, work it back to break-even sales,” DiPietro says.
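DiPietro's break-even arithmetic is simple to sketch. The campaign cost, average sale and margin below are all invented for illustration:

```python
# Break-even sales analysis for an online campaign (illustrative numbers).
campaign_cost = 12_000.0   # total spend on the campaign (assumed)
avg_sale = 400.0           # revenue from an average sale (assumed)
margin = 0.25              # profit margin per average sale (assumed)

profit_per_sale = avg_sale * margin            # $100 profit per sale
break_even_sales = campaign_cost / profit_per_sale  # sales needed to break even
```

Here the campaign would need 120 sales before it pays for itself; fewer than that and the money was better spent elsewhere.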

Connecting the dots between Web analytics and sales data has been largely a manual process for DiPietro, who spends more hours than he would like handcrafting spreadsheets to complete his analysis.

Web analytics firms such as WebTrends, Omniture and WebSideStory are addressing this software void with applications and services that can link Web and sales data to simplify calculating the return on investment. These applications can incorporate Web data such as traffic analysis, email marketing and search marketing performance with customer relationship management sales data.

What should be measured, according to WebTrends’ Corey Gault, is a campaign’s value to the company’s bottom line: tracking a visit as it becomes a lead and on until the sales cycle is completed. Web data should be combined with higher-level key performance indicators (KPIs) such as cost per visit, cost per lead and cost per sale, he says. “KPIs can also be combinations of various metrics, such as revenue dollar per marketing dollar spent, or percent of orders from repeat purchasers,” says Gault.
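Gault's KPIs reduce to a handful of ratios. A minimal sketch, using hypothetical campaign totals:

```python
# Hypothetical campaign totals (all figures invented for illustration).
spend, visits, leads, sales, revenue = 5_000.0, 20_000, 500, 50, 25_000.0

kpis = {
    "cost_per_visit": spend / visits,                  # 0.25
    "cost_per_lead": spend / leads,                    # 10.0
    "cost_per_sale": spend / sales,                    # 100.0
    "revenue_per_marketing_dollar": revenue / spend,   # 5.0
}
```

The last ratio is Gault's "revenue dollar per marketing dollar spent" – the number that ties Web activity back to the bottom line.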

Measuring the lifetime value of online branding campaigns is challenging for companies that also sell off-line, as the ability to automate the process ends at the desktop. Refinery’s Moyer says customer surveys are an efficient method to link online with offline impressions. The surveys incorporate data collected by contacting customers about their behaviors before and after campaigns, and factor in both online and off-line (broadcast, print, outdoor) impressions. This enables companies to calculate how the campaign contributed to the overall sales effort, he says.

Metrics should factor in all of the times a company interacts with the customer, not only the most recent, which can skew performance data, according to WebTrends’ Gault. “Many marketing analytics solutions credit the conversion to the last campaign touched, effectively undervaluing all the programs that initiated awareness and consideration.”

Vendors are also re-engineering their products so that sales data can automatically be integrated with Web analytics to complete the campaign-to-revenue analysis.

“The ability to tie marketing metrics with sales metrics is one of the biggest problems that customers have,” according to Salesforce.com’s Swensrud. To address the difficulty in understanding the impact of keyword purchases on sales, the company introduced Salesforce for Google AdWords late in 2006. The software, which is sold as a service, traces the leads generated by keyword purchases and follows them through the sales process to determine their return on investment.

COMPARING OPTIONS

Although comparing current campaign-to-revenue performance with historical data is informative, marketers should create a baseline of return on investment so that they understand the relative value of each type of online campaign.

The cost per thousand of a keyword campaign may seem relatively low when compared with the cost of an email marketing campaign. However, determining the return on investment of each can justify what appear to be higher costs per customer contact, according to Marketing Management Analytics’ Brooks. He says calculating the individual return on investment for each type of online campaign enables an apples-to-apples comparison.

For example, the lifetime value of a customer acquired through keyword buys might be a fraction of that of someone originally contacted via email. After factoring in revenue, marketers can better decide the best marketing mix for their collective media expenditures.
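That channel comparison can be sketched as follows. All spend, customer and lifetime-value figures here are invented; the point is the shape of the calculation:

```python
# Compare acquisition channels by lifetime value per marketing dollar.
channels = {
    #            (spend,    customers, avg_lifetime_value) -- assumed data
    "keywords":  (10_000.0, 400,       150.0),
    "email":     (4_000.0,  100,       900.0),
}

results = {}
for name, (spend, customers, ltv) in channels.items():
    results[name] = {
        "cost_per_customer": spend / customers,
        "ltv_per_dollar": (customers * ltv) / spend,
    }
```

In this made-up example, keywords acquire customers at $25 apiece versus email's $40 – yet email returns $22.50 of lifetime value per marketing dollar against keywords' $6. The apparently cheaper channel is the worse investment.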

The volume of statistics contained in monthly Web analytics reports can make it a challenge to interpret the metrics that matter most. The bottom line: Don’t forget about the bottom line.

JOHN GARTNER is a Portland, Ore.- based freelance writer who contributes to Wired News, Inc., MarketingShift, and is the editor of Matter-mag.com.

Overcoming Your SEO Fears

Ask nearly everyone and they’ll say that search engine optimization is intimidating. Search engine optimization – SEO for short – should be a familiar term and practice for anyone or any commercial company with a website. SEO is what you do to your website to get a higher ranking on search engines, particularly Google, Yahoo and MSN. The higher you rank, the more likely someone will click through to your site and buy your stuff. Lately, information and tips on just how to do that can fill a library.

“I don’t think you can be in business without realizing that search is a big part of the tool you need – you need to have a strategy to be found,” says John Battelle, search guru and author of the book The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture.

And yet being found is still perceived as some sort of magic formula. “SEO is not sorcery or deception; just something that requires diligent research and staying on top of changes to the way search engines do things,” says Joe Balestrino, who runs the Mr. SEO website.

If someone enters the term pizza into Google, for example, the first results are most likely the product of SEO. Pizza Hut, Domino’s and Papa John’s have all made an effort to rank in the top three spots on Google. Whether they remain there is something search engine marketers will need to stay on top of. Search engine marketing – SEM – refers to the tactics employed to rank higher, whether through paid search or other nonpaid methods. One such tactic is transforming a website’s look and feel to gain a higher ranking.

Many Search Marketers Fail to Measure Results

Take the term iPod and plug it into Google. What you get is a sponsored (or paid) search result for Apple. The first nonpaid result is also Apple. Not a coincidence. The Apple brand is so strong that it ranks very high on unpaid results, and paying for a sponsored result is just bet hedging.

People who are new to selling on the Web can get very confused by the “science” behind SEO. Talk of relevant keywords, algorithms and cost per click can terrorize Web sales newcomers. It’s an issue that continues to frighten brand-name companies as well. Since the concept of SEO is only about eight or nine years old, most companies have typically hired a chief marketing officer with as little as two years’ experience in matters of SEO.

Companies are also realizing that search engine marketing is a full-time job and have created executive positions just to monitor and enact SEM strategies. Companies that will do your SEO for you are growing as well. Books and conferences continue to provide advice whether you are a newbie or have been practicing SEO for awhile.

While trying to demystify SEO for people who have visited a few dozen websites and still not been able to understand it, we can’t ignore the advancements in SEO and how big the market has become. Search again finished first in online ad spend in 2006, to the tune of 40 percent of total online advertising revenue, according to the Internet Advertising Bureau and PricewaterhouseCoopers. That 40 percent share is predicted to hold through 2010, according to eMarketer.

Back when there wasn’t a name for SEO, the tried-and-true way to rank high on search engine results pages was using as many keywords as you could in your content. If you sold cigars, putting the word cigar in your articles and written materials as many times as humanly possible would probably get you a pretty high ranking. With the ascension of Google and its algorithmic rankings, that doesn’t work so much anymore. Not to look too far under the hood, but the Google algorithm that ranks pages basically looks at who is linking to whom on the Internet and the quality of those pages. The more high-quality pages linking to you, the higher you get.
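That link-counting idea can be illustrated with a toy, PageRank-style iteration. The real algorithm is far more elaborate, and the pages and links below are made up; the sketch only shows how rank flows along links until the most linked-to page rises to the top:

```python
# Toy link graph: which pages link to which (entirely hypothetical).
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],   # "c" earns links from three pages, so it should rank highest
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}   # start with equal rank
damping = 0.85                                # standard PageRank damping factor

for _ in range(50):  # iterate until the scores settle
    new = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)  # split rank across links
        for target in outlinks:
            new[target] += share
    rank = new

best = max(rank, key=rank.get)  # the most linked-to page wins
```

Page "c", with in-links from three pages, ends up ranked highest – the "more high-quality pages linking to you" effect in miniature.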

Most marketers employ a combination of SEO and paid search, also called pay per click, which results in a sponsored ad when someone searches for certain keywords. For example, that’s why searching for iPod brings up Apple’s URL in the sponsored position and as the first search result – or the “natural” search result.

Getting there has been considered by some as rocket science. And there is a current debate in the industry over whether SEO is too hard for the average Joe to execute effectively. Some consultants who do SEO say, of course, it’s a very difficult science. Critics claim that search gurus want to keep SEO sounding complicated so that they will continue to get your business.

“SEO is a new-school-of-marketing thought – switching someone’s beliefs is nearly as difficult as converting someone’s religion,” says Todd Malicoat, who consults on SEO from his StuntDubl.com site.

“I think that there’s a complete misnomer that SEO equals top position on the search engines,” says Dave Taylor, tech blogger at AskDaveTaylor.com. “In fact, smart SEO is much more about being findable for the specific keywords and phrases that will drive customers to your site, rather than just a more simplistic popularity contest.”

Job Functions Performed by Search Engine Marketers in the U.S. – 2006

That said, there is no denying that SEM efforts continue to grow. Forty-two percent of advertisers say that their SEM budgets are new, says the Search Engine Marketing Professional Organization (SEMPO) in its recent annual survey of marketing executives. The survey also found that 83 percent of advertisers prefer organic (or natural, nonpaid) search, while 80 percent put paid search in second place. Fifty-nine percent of respondents said sales was their primary goal for SEM; 53 percent cited brand awareness and 48 percent lead generation. SEMPO’s 2005 survey stated that the North American search engine marketing industry grew to $5.75 billion. That’s a 44 percent jump over 2004.

“Search engine marketing is growing at a faster rate than television, than radio, than print media,” pepperjamSEARCH.com CEO Kristopher B. Jones said at a press conference in August 2006.

While brands are becoming more adept at SEO, a battle is still ongoing behind the scenes between traditional advertising and search marketing. “I think the big brands are starting to get it, but yes, at a snail’s pace,” says Mr. SEO’s Balestrino. He says that while a few of the “heavy hitters” have been looking for SEO people to get them started, most still rely on the word of their SEM. “As you might imagine, few SEMs are into SEO because it can greatly reduce the need for PPC over the long haul.”

Balestrino adds that some companies are starting to see PPC “for the losing battle it can become,” especially among highly competitive retailers and service providers. “The company who is willing to spend the most can rank the highest, but the ROI is dwindling because PPC costs are rising faster than inflation,” he says.

StuntDubl’s Malicoat says that “I no longer try to claim that ‘branding is dead,’ but that certainly won’t keep me from kicking it while it’s down. It’s amazing how often I see a big brand completely blow top search rankings that could have been achieved with a little understanding, some initial planning and very little additional budget.”

Keywords are Key

The camp that believes SEO is not an intricate science says the first thing any SEO beginner needs to do is figure out what your relevant keywords are. Since the major search engines organize results based on the word or words people type into the engine, knowing how to organize your keywords is step one. Plus, coming up with all the applicable keywords for your site helps you understand much more clearly what it is you sell.

You can find these keywords by writing down as many as you can think of, or you can survey your core audience. You can also buy software to help you come up with words. There are sites and software to help you find the value in the keywords you have come up with, such as Overture.com (now known as Yahoo Search Marketing) and WordTracker.com.

Statistics associated with your keywords will help you decide what words get more traffic than others. Companies such as Trellian offer SEO software tool kits that help manage keywords, check your rankings, edit your meta tags and create PPC bid comparisons, among other things. For your cigar site, for example, “Dominican cigars” may get far more traffic than “Mexican cigars.” In fact, the latter may do so poorly as to warrant omission when it comes to keyword bidding.
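Triaging keywords by traffic, as in the cigar example, might look like the sketch below. The search volumes and the bidding threshold are invented for illustration:

```python
# Rank candidate keywords by estimated monthly searches (assumed numbers),
# dropping any below a threshold not worth bidding on.
keywords = {
    "dominican cigars": 9_500,
    "cuban cigar reviews": 4_200,
    "mexican cigars": 140,   # too little traffic to justify a bid
}
MIN_SEARCHES = 500

worth_bidding = sorted(
    (kw for kw, vol in keywords.items() if vol >= MIN_SEARCHES),
    key=keywords.get,
    reverse=True,
)
```

“Mexican cigars” falls below the threshold and is omitted from bidding, exactly the call the traffic statistics are meant to support.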

What you are shooting for is a site that can be easily indexed by a search engine. Search engines send out automatic programs that look for new content and create an index based on the words on each website page. That means that attention to Web design and quality writing will boost your site’s chances of getting high rankings. For example, keep each Web page small in size, avoid broken links, use correct HTML, keep your server up all the time, avoid identical Web pages on your site and provide good, clear navigation. All these elements help the site get indexed by search engines.

Some believe that little things such as title tags can make a difference in your site’s rankings as well. Title tags are the one-sentence descriptions coded into the HTML that are displayed at the top of the browser when you visit a page. Organizing your site in URLs that make sense also may contribute to rankings (URLs that show the structure of the site in words as opposed to a series of numbers). Something as simple as a sitemap page helps when you are being indexed.

Especially in Google, the more sites that link to your pages, the higher your ranking will be. This is called backlinking and is very effective when “high quality” sites link to you – and is even more effective if the link text itself contains one of your keywords. This may mean you will need to put on a public relations hat and send information or press releases to other sites. Joining industry Web forums can help get the word out as well.

This is how SEO has operated over the last few years. There have been many nuances along the way and those degrees of execution are what make SEO seem very impenetrable. In recent years, companies and website owners have opted to buy SEO from firms that do it for you – pepperjamSEARCH.com, Prominent Placement, Fathom SEO, SEOInc.com, EngineReady.com, Adoofa.com and iProspect, just to name a few.

SEMPO’s annual survey indicates that more companies are interested in outsourcing their SEO, but the overall numbers are still small. Only 26 percent of advertisers plan to outsource half or more of their paid-placement budgets for 2007. About two-thirds said they plan to do all of their organic SEO in-house. Only 10 percent said they would outsource all of their SEO needs. The high tech sector is also not immune to the difficulties of search. Among top software firms surveyed by marketing research firm MarketingSherpa, 25 percent were not “sufficiently optimized for search engine visibility.”

Brand Awareness

Some search pundits and bloggers continue to believe that big companies are slowly awakening to the power search brings to their brands. “Big companies are doing too little, and many small companies are too focused on SEO at the price of good content production. The magic bullet is just to produce lots of good, fresh unique content; not to play SEO games and trick people into linking to you,” says Taylor.

Having everyone on the same side of the fence would solidify search as a must-have for all companies. Currently the jury is still out about what constitutes the best approach to search. Some critics have written that SEO is a “one-time fix” – that once a site is optimized, you won’t have to touch it again. Counterarguments are that sites have to change as search engine algorithms change. “I think there are more than a few [SEO firms] that give companies the indication that over-thinking an SEO strategy is necessary, when in reality, it isn’t,” says Mr. SEO’s Balestrino. He says that most companies budget SEO expenditures pretty low, but not necessarily too low to be effective. He says to be wary of the “overly grandiose implementation.”

Others predict that the future of SEO is specialization – broad-category specialists who see SEO as “just plain marketing” and people who will specialize in areas such as keyword research, link building and analytics.

Search engine marketers are still having a hard time because so many of them working for mid to large companies are not focusing exclusively on SEO. A JupiterResearch/iProspect survey found that 88 percent of SEMs are doing SEO; however, 58 percent of them are doing website design, 26 percent do public relations, 44 percent do market research and 22 percent do direct mail. There are only so many hours in the day and only so many hats for overworked SEMs.

This is where legitimate SEO firms hope to gain ground. While MarketingSherpa research stated that SEO firms were still mostly “mom-and-pops,” staffs are growing at these firms and client accounts have doubled. MarketingSherpa says there are a handful of SEO firms reporting more than $10 million in revenues from SEM work and there are at least a few companies reporting $20 million in SEM revenues. The research reveals that most of these businesses have only been in operation for about four years at the most.

Some say that the buy-in from companies who could use search isn’t complete. SEMPO’s 2005 marketing survey stated that only 37 percent of companies said that executives were “moderately interested in search engine marketing practices.” Even companies who would like to outsource their SEM appear to be intimidated by the choices. “The biggest mistake is not doing enough homework on who is reputable and what works,” says StuntDubl’s Malicoat. “Companies should search names, company names, past company names and really be diligent in learning what is going to work best for them.”

Battelle is straightforward when it comes to telling companies what they need to do. “It is hurting companies that don’t use search,” he says. “It is our user interface. It is like a listing in the Yellow Pages.”