Workday Rising 2014 Recap: Predictive Analytics Take Center Stage as Workday’s Pace of Innovation Accelerates

Keith Mattioli,
Principal, Advisory, Enterprise Solutions

Last week I attended Workday Rising, the annual forum that brings together Workday executives, customers, prospects, and partners to celebrate the past year’s client successes and share Workday’s vision and direction. Now that a week has passed, I have a few post-event observations.

First, the atmosphere at Workday Rising 2014 felt very different than a typical software conference. Even with 5,000 attendees—which certainly isn’t huge by Dreamforce and Oracle OpenWorld standards—Workday Rising seemed much more like a users group meeting than a vendor event, and in fact, it’s billed on the Workday Web site as “a community for customers to meet each other and share ideas.” It felt like a bunch of like-minded people getting together to share ideas and have fun.

The rate of innovation from Workday is accelerating. We heard about a slew of new products and features, including, notably:

  • A new mobile experience for Workday for iPad and Workday for iPhone. Workday’s mobile user interface is already a leader, and the company isn’t being complacent; it continues to innovate. Workday’s “elegant, personal, and efficient” applications have added animations to bring the design foundation to life and create a familiar and intuitive feel. Users can also personalize their mobile experience: over time, their search results, prompts, and menus will improve based on how they use the product. Workday Rising attendees could upgrade to the new UI on the spot on an opt-in basis.
  • Updates to the look and feel of Workday HCM. Mobile isn’t the only area where Workday is refusing to rest on its laurels. The company also refreshed some of the main HCM components with a consumer-grade user experience. These updates were made available to customers immediately, outside of the standard Workday update cycle. No reason to keep users waiting!
  • Continued innovation with the core applications. In addition to the improvements to Workday HCM, Workday is making “standard” enhancements to its core products, including HCM, Workday Financial Management, and the integration cloud platform, with a focus on providing real-time information to business users.
  • Predictive analytics and big data took center stage. The big news at Workday Rising was the unveiling of Workday Insight Applications, a new suite of applications that uses advanced data science and machine learning algorithms to help customers address different business scenarios. It showcases the value of a consolidated enterprise platform that holds all of an organization’s data in one easily accessible place, with the ability to do true predictive analysis rather than just reporting. Because it is integrated into the solution, business users can not only see what happened last month and last quarter but also start to look ahead.

During the first day’s keynote, Dan Beck, vice president of technology products, and Adeyemi Ajao, vice president of technology strategy, gave a great example around employee retention. They showed how Workday can display the top performers who are most likely to leave based on internal (years in position) and external (job searches) data. Managers can then be proactive in recommending job changes for each person, each change ranked by its impact on retention risk. Powerful!
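
To make that concrete, here is a minimal sketch, in Python, of how a retention model like the one demonstrated could score people and rank interventions. It is not Workday’s implementation; the signals, weights, names, and interventions below are illustrative assumptions only.

```python
import math

# Illustrative only: a toy attrition-risk scorer, not Workday's model.
# Weights are made up; a real model would be learned from historical data.
WEIGHTS = {"years_in_position": 0.45, "external_job_search": 1.6, "bias": -2.0}

def attrition_risk(years_in_position, external_job_search):
    """Return a 0-1 risk score from one internal and one external signal."""
    z = (WEIGHTS["bias"]
         + WEIGHTS["years_in_position"] * years_in_position
         + WEIGHTS["external_job_search"] * external_job_search)
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing to a probability-like score

# Hypothetical top performers with internal/external signals.
top_performers = [
    {"name": "A. Rivera", "years_in_position": 4, "external_job_search": 1},
    {"name": "B. Chen",   "years_in_position": 1, "external_job_search": 0},
    {"name": "C. Osei",   "years_in_position": 6, "external_job_search": 1},
]

# Rank the people most likely to leave.
for p in top_performers:
    p["risk"] = attrition_risk(p["years_in_position"], p["external_job_search"])
ranked = sorted(top_performers, key=lambda p: p["risk"], reverse=True)

# Rank candidate interventions for the riskiest person by how much each
# would cut the modeled risk (e.g., a new role resets years in position).
riskiest = ranked[0]
interventions = {
    "lateral move": {"years_in_position": 0, "external_job_search": riskiest["external_job_search"]},
    "promotion":    {"years_in_position": 0, "external_job_search": 0},
}
for label, new_state in interventions.items():
    new_risk = attrition_risk(**new_state)
    print(f"{riskiest['name']}: {label} lowers modeled risk by {riskiest['risk'] - new_risk:.2f}")
```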

  • Accelerating technical change. One of the other key areas that Workday continues to improve upon is its core infrastructure support—not the traditional “ping, power, and pipe,” but rather how Workday manages the application within that environment. Workday updated nearly 700 tenants (production customers) to Workday 23 in six hours. Their goal was 12 hours. That’s amazing. How long did it take you to upgrade your legacy platform? It probably wasn’t measured in hours. Multiply upgrade time by the number of legacy platforms out there, and you get a huge productivity loss across enterprises. Maybe we should come up with a metric—there’s already one that estimates lost productivity due to the time we spend sitting in traffic.


  • Workday really listens to its customers’ suggestions. Workday Brainstorm is the mechanism Workday uses to solicit product enhancements. The community of Workday customers then votes on the ones they think will have the most impact. It’s an interesting process in which a lot of lobbying goes on. Workday incorporated more than 100 of these customer-suggested enhancements into the latest release—which represents a significant percentage of new features!

This customer involvement just reiterates how different the Workday community is from those of other traditional solutions. The level of collaborative customer engagement—in which ideas and artifacts (integrations, reports, etc.) are openly shared—creates an environment where companies receive broader value because they’re all on the same version of the solution, so each improvement benefits the entire community.

As a leading provider of Financial Management and Human Capital Management services, KPMG draws upon extensive industry knowledge and experience to help transform business using enterprise technology.

Providers Being Held to a MUCH Higher Standard

Cliff Justice,
U.S. Leader
Shared Services & Outsourcing Advisory
David J. Brown,
Global Leader
Shared Services & Outsourcing Advisory

Clients are rapidly losing patience with service providers that aren’t working proactively to provide more value than the basic terms of the original outsourcing contract. Many clients are actively looking to fire their provider if it cannot get past operational teething issues and begin transforming the way things are done. Outsourcing is no longer about achieving significant cost reduction targets and getting basic tactical operations functional – it’s about moving clients into a future state that is much more effective than the current one. Simply running client engagements as cheaply as possible with limited investment is a sure route to failure.

25% of clients are actively looking to fire their provider.

Less focus on the deal, more on the relationship. Providers are frequently forced to say whatever they need to say to win the deal, as opposed to offering a strategic partnership that can add more skill, technology, and analytical capability to clients. Some clients are beginning to doubt that their current provider actually has the skills or acumen that were promised during the early courting days, prior to contract signing. Our research shows that close to a quarter of clients will actively seek to eject their current provider if it has not effectively helped them standardize, automate, and transform their processes within the next two years.


Related Reading

The Future is Here: Enterprise Services Governance
Gain the essential perspectives and tools needed to build and grow an enterprise services governance organization.

Managing Risk in Global Business Services Operations
Anshul Varma explains risk from a three-dimensional view as it relates to GBS operations.

Executive Report: The State of Services and Outsourcing in 2014
Gain insight from 1079 enterprises, outsourcing service providers and industry consultants on where the services and outsourcing industry is heading in the near and long-term future.

2014 Research Finding – Cloud is Already Replacing Legacy Outsourcing

Cliff Justice,
U.S. Leader
Shared Services & Outsourcing Advisory
David J. Brown,
Global Leader
Shared Services & Outsourcing Advisory

Ambitious and sophisticated clients are now seeing the huge benefits of shifting from on-premise to “As-a-Service” delivery. This isn’t something that will happen a few years from now; it’s already happening. Our latest research shows close to one in three enterprises already using (or about to use) BPaaS/cloud as an alternative to legacy outsourcing in areas such as HR, industry-specific operations, finance and accounting, and procurement.

Having a provider that understands and can implement a cloud platform, support the transformation and provide the necessary services that add real value to the front-office is the Holy Grail for many buyers. With half of today’s outsourcing contracts potentially up for grabs, those providers with genuine platform plays are in position to pick off legacy outsourcing contracts that have stalled in finding future value.

The successful providers will be those that can bridge the divide between cloud/infrastructure and process delivery. Most buyers will tell you that having to deal with the technology and operations divisions of providers is akin to dealing with two separate companies. There is often very little synergy – and very different cultures – between those that deliver business processes and those who design, develop, and maintain technology solutions. When providers separate process delivery and operations from technology, it nearly always makes it challenging to get the right access to funds, align the stakeholders that matter, and make meaningful, credible business cases. Having operations and infrastructure in the same unit creates a greater opportunity for transformational thinking at the solution-architect level in operations, by increasing awareness of – and exposure to – emerging technologies.



Related Reading

The Future is Here: Enterprise Services Governance
Gain the essential perspectives and tools needed to build and grow an enterprise services governance organization.

Managing Risk in Global Business Services Operations
Anshul Varma explains risk from a three-dimensional view as it relates to GBS operations.

Executive Report: The State of Services and Outsourcing in 2014
Gain insight from 1079 enterprises, outsourcing service providers and industry consultants on where the services and outsourcing industry is heading in the near and long-term future.

The Internet of Everything – Riding the New Wave of Business Opportunity

Justin Hoss,
Principal, CIO Advisory

We are now at the beginning of a new technology wave that will become a tsunami as wearable devices (fitness bands, watches, eyeglasses), automobiles, appliances, and sensors of all kinds connect to the Internet. It’s called the Internet of Everything (IoE). Whereas the Internet of Things (IoT) connects physical devices to the Internet, the IoE enables connected devices to take autonomous actions based on real-time data, processes, and information.
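
As a toy illustration of that distinction, the sketch below shows a connected device acting on a real-time reading locally and emitting an event for downstream processing. The device, threshold, and field names are hypothetical, and no real IoT platform API is used.

```python
# Illustrative only: a toy rule that lets a connected device act autonomously
# on a real-time reading rather than just reporting it. No real IoT API is used.
def on_sensor_reading(device, reading_celsius, threshold_celsius=8.0):
    """If a refrigerated-truck sensor runs warm, act locally and emit an event."""
    if reading_celsius > threshold_celsius:
        device["compressor"] = "boost"          # autonomous local action
        return {"alert": "temperature_excursion",
                "device_id": device["id"],
                "reading_c": reading_celsius}   # event pushed to the back end
    return None

truck_sensor = {"id": "truck-042", "compressor": "normal"}
print(on_sensor_reading(truck_sensor, 9.4))  # event dictionary
print(truck_sensor["compressor"])            # "boost" - the device already acted
```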

$19 trillion up for grabs

Other than lots of connected devices, what does all this mean? For businesses, it can mean quite a lot — up to $19 trillion of value (net profit) over the next decade for private sector businesses globally, according to Cisco.

In fact, the IoE has the potential to transform how we do business with real-time insights, richer experiences, new products, more flexible service delivery, autonomous behavior, and greater revenues. Already, IoE early adopters include —

  • Manufacturers using smart factory and supply chain applications
  • Utilities deploying smart grid applications
  • Retailers implementing customer optimization applications
  • Healthcare providers using telemedicine for remote monitoring and remote delivery of services.

Key questions

For CIOs and CTOs, key questions about the IoE don’t involve “if” but “how, what and when.”

  • How will the IoE change the way our organization deploys connectivity and technology?
  • What should we be doing to remain agile and keep up with this new wave?
  • When can we capture some of the business value that the IoE is already creating?

What’s holding you back?

Despite the new wave of business opportunities, many organizations still struggle with the idea of testing the IoE waters. In our KPMG white paper The Internet of Everything is Now, we identify major barriers to entry for the IoE, ranging from “Separating the signal from the noise” to “How should I eat the elephant?” The paper also presents a multi-layer framework to help decision makers think through the issues and begin to capture value by operationalizing the IoE for their organization.

Learn more about the IoE and how it is likely to impact you, your competitors, and your industry.

Harnessing The Power of Analytics

Dipan Karumsi, Managing Director,
KPMG Procurement Advisory Services
KPMG’s Procurement Advisory services focus on delivering enhanced business performance and driving bottom-line savings. Let us help you transform the way you source and manage your supply base.

In our last post we talked about spend analytics, the danger of the “other” category, and how it can easily lead to unreliable data. Today we’ll talk about how a standardized taxonomy can create dependable analytics.

For many organizations, the United Nations Standard Products and Services Code (UNSPSC®), a widely used standard taxonomy for products and services, can be a good starting point for building or cleaning up categories. It provides standardized reporting, but it can also be overwhelming for some enterprises because of the level of detail it contains. This can end up creating more confusion, depending on the skill and experience of the people using the system.

You can edit the UNSPSC® down or roll some things up into larger categories, keeping organizational fit and alignment in mind. If you have too many categories and the distinctions are too fine, you’ll get the same item showing up in different categories. Then you’ll start to get those “Where did my spend go?” types of questions, which quickly lead to data distrust. On the other hand, if you decrease the categories available to the end user too far, sometimes they just won’t have what they need.
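
As a simple illustration of that rollup, here is a sketch of mapping detailed UNSPSC-style commodity codes to a smaller internal taxonomy, with a family-level fallback. The specific codes and category names are assumptions for illustration, not a recommended structure.

```python
# Illustrative only: roll detailed UNSPSC-style codes up to a smaller internal taxonomy.
# Codes and category names here are hypothetical, not a recommended structure.
ROLLUP = {
    "44121600": "Office Supplies",   # desk supplies
    "44121700": "Office Supplies",   # writing instruments
    "43211500": "IT Hardware",       # computers
    "43232300": "IT Software",       # data management software
}

def internal_category(unspsc_code):
    """Map a detailed code to the organization's broader category, if one is defined."""
    if unspsc_code in ROLLUP:
        return ROLLUP[unspsc_code]
    # Fall back to the code's 4-digit family before giving up, so near-misses
    # still land in a managed bucket rather than drifting toward "other".
    family_matches = {cat for code, cat in ROLLUP.items() if code[:4] == unspsc_code[:4]}
    return family_matches.pop() if len(family_matches) == 1 else None  # None forces review

print(internal_category("44121600"))  # Office Supplies (direct match)
print(internal_category("44121900"))  # Office Supplies (family-level fallback)
```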

Every organization is different, and you have to identify what makes sense for your situation. Your taxonomy has to strike a balance: it must provide the visibility and data you need to manage effectively without being too onerous for end users to adopt.

One key note: Taxonomy development should not be done in a bubble. Creating a spend taxonomy should be a collaborative process that includes a broad group of stakeholders. The more input you receive, the greater the buy-in will be on utilization and self-service reporting.

While some organizations opt to refine their taxonomy as they mature, I recommend they clearly define and deploy it up front. Constant changes in the way users see the taxonomy can cause confusion. In addition, spend reporting will be impacted if categories are removed, edited, or added, and the alignment of workflow approvals to categories will need to be updated each time the taxonomy changes.

Most important of all, do not create a category called “other” (or “miscellaneous” or “uncategorized”). Getting real-time visibility into spend is one of the big goals here. Allowing this kind of category automatically introduces the need for a manual process, which causes delays and inaccuracy.

If “other” is not an option, end users will learn to find the right category. Procurement can provide information and training to help. Since most requesters work with a relatively small set of 10-15 categories, they’ll be very effective at selecting the right ones once they learn them.
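
Here is a minimal sketch of what that guardrail could look like at requisition entry, assuming a hypothetical validation hook and made-up category names. Real eProcurement tools expose this differently, but the idea is the same: require a category and reject “other”-style values.

```python
# Illustrative only: a requisition-entry guardrail that refuses "other"-style categories.
# Field names and the category list are hypothetical, not tied to any specific tool.
BANNED = {"other", "miscellaneous", "uncategorized"}
VALID_CATEGORIES = {"Office Supplies", "IT Hardware", "IT Software", "Professional Services"}

def validate_requisition(line_item):
    """Return a list of problems; an empty list means the line item can proceed."""
    problems = []
    category = (line_item.get("category") or "").strip()
    if not category:
        problems.append("Category is required; it cannot be left blank.")
    elif category.lower() in BANNED:
        problems.append(f'"{category}" is not allowed; pick a real category.')
    elif category not in VALID_CATEGORIES:
        problems.append(f'"{category}" is not in the approved taxonomy; route to procurement for review.')
    return problems

print(validate_requisition({"category": "Other"}))        # rejected with a message
print(validate_requisition({"category": "IT Hardware"}))  # [] - passes
```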

An up-front investment in creating a thoughtful taxonomy will give everyone across the enterprise the ability to run accurate, real-time, ad hoc reports in the eProcurement application. This is the power of analytics, and once people experience it and see how much easier it makes their jobs, having no “other” will be no issue.

For Spend Analytics You Can Trust, There Can Be No “Other”

Dipan Karumsi, Managing Director,
KPMG Procurement Advisory Services
KPMG’s Procurement Advisory services focus on delivering enhanced business performance and driving bottom-line savings. Visit us on the Advisory Institutes to learn more about how we combine thought leadership and global perspectives to help you transform the way you source and manage your supply base.

In my last post, I talked about how timely access to data can help procurement organizations avoid embarrassing moments with suppliers. Now let’s turn our attention to getting quality data downstream.

This requires not only having broad adoption of the eProcurement tool, but also having a well-defined category taxonomy and making sure people know how to use it accurately. The taxonomy is where many organizations fall down; they simply don’t put enough time and thought into developing it, and thus create continuous challenges in getting access to quality spend data.
Developing a well-defined taxonomy takes discipline and thought. Yet, in the rush to deploy the system, organizations may create a bare minimum of categories without the appropriate levels, leave the category field optional, or even skip this step altogether, which then forces a scramble to assemble this information at the time of go-live.

Or, they may create a category called “other.” If you want analytics you can trust, you need a category for everything. There can be no “other.” “Other” gives people who don’t want to think too hard an easy way out. Before you know it, you’ve got “other” showing up as 20% of your overall spend, which requires a significant re-categorization effort and can cause considerable data distrust.

An unorganized taxonomy also creates a lot of churn downstream. If you have transactions being routed by category to the buying organization, buyers are going to have to research and identify the correct categories, adding time and manual effort to the process. The most efficient way to get quality data out of your analytics system is to make sure your taxonomy and process are defined across the entire value chain from the start. Think of it as pre-analytics.

One of the first things we do when we work with an organization on sourcing or procure-to-pay is look at the spend taxonomy to see how it’s organized and understand how well the organization aligns around it. Is everything in alignment, or are there a number of unmanaged categories that have significant areas of spend? Is there a lot of spending in an “other” category?

Insight into this information brings us to the topic of taxonomy development and how it can unleash the power of analytics, which is where we’ll pick up in the next post.

Digital “Haves” and “Have Nots”

By Lee Ann Moore, Director, Industries and Marketing

In the video below, KPMG’s Dave Wolf and John McCarthy of Forrester Research discuss how organizations need to rethink their approach to technology to support a digitally enabled business model – one that is both customer- and employee-centric. It requires agility and an iterative process, often quite different from historic approaches to technology implementation and upgrades. Historic approaches, typically managed by static organizations, took an inside-out view as opposed to the customer-centered, dynamic outside-in perspective expected today. This shift may require a digital transformation, or at least a planned journey toward digital enablement. Without embracing these changes, organizations risk becoming “Digital Have Nots” and falling further behind in today’s increasingly dynamic and competitive marketplace.

Video: KPMG’s Dave Wolf and Forrester’s John McCarthy on digital transformation

For further information or ideas on a customer-centric approach to digital transformation, contact David Wolf or Lee Ann Moore.

2Q 2014 Pulse Survey Results: Five Key Takeaways for Global Business Services (GBS)

Stan Lepeak, Global Research Director, KPMG LLP

Takeaway 1

One key to driving greater global business services (GBS) maturity is improving the underlying IT capabilities used to support GBS efforts, both the collective set of technologies and the IT group itself, and better aligning them with those efforts. More standardized IT applications and systems enable a more integrated IT environment. Standards are critical to creating more integrated and end-to-end GBS operations across functions, geographies, and business units. Key to enabling this standardization is greater integration of IT operations into the GBS organization, as well as formalized joint governance structures. It is also critical to define specific IT “innovation” strategies to exploit the creative potential of existing and new technologies, such as robotic process automation, to drive GBS maturity and integration.

Read: 2Q14 Global Pulse Survey Report

Takeaway 2

Challenges abound in any major service delivery change effort. They are many and varied, and unfortunately most of them are not new. They include weak or neglected change management capabilities and the perennial retained organization, transition, and governance issues. While organizations have improved their skills to address these challenges, the steady increase in the scale and scope of today’s GBS efforts continues to increase the complexity of these issues. To succeed, organizations need to balance their GBS ambitions against their capabilities to successfully address these challenges.

Read: The Art of Services Governance

Takeaway 3

Greater standardization of IT applications and systems, particularly enterprise resource planning (ERP) systems, is a key enabler of greater GBS maturity and integration. While moving towards a single instance of ERP to support GBS operations is not practical for many companies, it is advisable to at least strive for a model with a common ERP architecture and template across multiple instances of ERP and related enterprise systems.

Read: Service Integration: Maximizing the Benefits, and Minimizing the Risks, of a Multi-Sourced IT Environment

Takeaway 4

As greater focus is placed on the role of IT in enabling GBS success, IT organizations need to start to assess and “benchmark” the capabilities of their own GBS IT operations against industry best practices. Emerging frameworks such as enterprise services management provide the potential to help enable this.

Read: Executive Dilemma: Is Benchmarking the Right Path to Defining Opportunities for Improvement?

Takeaway 5

There are many drivers for organizations’ service delivery improvement efforts and even more challenges faced in successfully completing these efforts. Reducing operating costs remains the top driver; however, it is an assumed and tactical benefit. Improving process performance, along with global service delivery capabilities and operating models, has gained in importance as a GBS driver. End-to-end process ownership, integration of GBS efforts across functions, and better integration of shared services and outsourcing efforts under the GBS umbrella are some of the keys to achieving these goals.

 Read: Drive to the top: The Journey, Lessons, and Standards of Global Business Services

For more on Global Business Services and other related topics, please view our library of papers on the KPMG Shared Services and Outsourcing Institute.

Spend Analytics Helps Avoid Awkward Moments With Suppliers

Dipan Karumsi, Managing Director, KPMG Procurement Advisory Services

Working with organizations in various sourcing and procurement roles over the past 17 years, I have seen one challenge come up again and again: a lack of timely access to information. The good news is that real-time, or near real-time, spend analytics is within reach for companies willing to undertake a process of continual improvement, and there is a definite competitive edge to be gained by doing so.

Based on what I have seen, a three-month lag from the time of spend to availability of reporting is common, and many organizations cannot get access to data about spend that took place as long as six months ago. Real-time data is almost out of the question for most organizations, because by the time they run the reports, check for data integrity and test to make sure there are no big issues, a couple months have gone by.

The further you get down the road, the more likely it is that you’ll be looking at many transactions that will never recur, spending valuable time and effort on things you can’t optimize.

Lack of timely data, or bad data, can also create some awkward moments, such as having to go to your suppliers and ask them how much money you spent with them. This happens more often than you would think. The supplier will simply ask, “For what time frame?” and will usually be able to run the reports very quickly, because they know their business on the customer side and can give you detailed information.

Not only is this an embarrassing situation, you also run the risk that they only give you what you ask for, or withhold material information. They’re not going to want to share anything that puts you in a better negotiating position. Robust spend analytics helps you avoid these awkward moments and show up to supplier negotiations with more information than they have.

Better, faster access to information lets procurement organizations make smarter decisions across the board, leverage their spend across business units or plants or sites, and really drive value through savings in the sourcing process. This is one of the goals that many companies are looking to achieve through their e-procurement initiatives.

Both direct and indirect spend are critical areas for organizations, though traditionally the focus has been on direct spend. In recent years I have noted that attitudes about indirect spend have changed significantly. Many organizations which previously ignored indirect spend because it was “less strategic” or smaller in value have begun to take interest in this spend, recognizing that the opportunity to drive value is in fact quite significant.

Most organizations are now organizing the procurement function to focus on indirect spend, but in many cases they have not invested in the people, processes, and technologies needed to manage it. They are just now starting to realize there are many applications that focus on getting visibility into indirect spend to drive additional value.

As organizations evolve in this direction, a question that quickly surfaces is how to set up spend analytics. One way is directly through the procurement tool. Another option is to pull the information into a broader data warehouse that has access to all the other areas of spend, including direct spend, so you can generate reports from a single database. If you have global operations, you may want to pull data from all geographies so that you have access to all of your global data in one location. There are different ways to do it, but a key consideration is the speed with which you get access to that data.
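
As a rough sketch of the consolidated-database option, the snippet below combines spend feeds from two hypothetical regional sources and rolls them up by category so reports come from one view. The source names, fields, and amounts are illustrative assumptions.

```python
from collections import defaultdict

# Illustrative only: combine spend feeds from multiple regions/systems and
# aggregate them by category so reporting comes from a single consolidated view.
north_america = [
    {"category": "IT Hardware", "amount": 120_000.0},
    {"category": "Office Supplies", "amount": 18_500.0},
]
europe = [
    {"category": "IT Hardware", "amount": 95_000.0},
    {"category": "Professional Services", "amount": 240_000.0},
]

def spend_by_category(*feeds):
    """Roll all feeds into a single category -> total spend mapping."""
    totals = defaultdict(float)
    for feed in feeds:
        for record in feed:
            totals[record["category"]] += record["amount"]
    return dict(totals)

for category, total in sorted(spend_by_category(north_america, europe).items()):
    print(f"{category}: ${total:,.0f}")
```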

The other key consideration is of course data quality. I see many companies that have access to data, but people don’t trust it. What ends up happening is they have to go to the individual parties in the organization that are doing the purchasing and ask a lot of questions, or again, back to the suppliers to get the information. It’s inefficient, and it’s humiliating.

The organizations I see that are doing this well are those that have e-procurement platforms and have placed a focus on driving adoption across the organization. They’ve set up a well-defined spend taxonomy, and made sure people know how to use it. Categories are required for all spend, and they have a governance structure with clearly defined roles that have the ability to approve category selection within the taxonomy. Additionally, they don’t use a category called “other” or “miscellaneous.”

Ultimately, the most efficient way to categorize transactions is to do it up front; this ensures a quick flow into the downstream analytics system and gets you quicker access to the information.

Remember, getting real-time visibility into spend is the big goal here. That doesn’t happen overnight, but as you start to get your category taxonomy deployed, upstream systems straightened out, and processes established, the effort around data validation should decrease.

The data should get better and better over time, so you get to the point where users can run reports in the application on an ad hoc basis with complete confidence. Then they are empowered to make smarter decisions and avoid those awkward supplier moments.

 

KPMG’s Procurement Advisory services focus on delivering enhanced business performance and driving bottom-line savings. Visit us on the Advisory Institutes to learn more about how we combine thought leadership and global perspectives to help you transform the way you source and manage your supply base.

Taking a Pulse on Global Business Services – Moving Up the Maturity Curve Is an Uphill Battle

Stan Lepeak, Global Research Director, KPMG LLP

KPMG recently released the results of its quarterly, global 1Q14 Sourcing Advisory Pulse surveys. These Pulse surveys provide insights into trends and projections in end-user organizations’ usage of global business services (GBS). The learnings are gleaned from KPMG firms’ advisors, who are working closely with end-user organizations that are actively exploring or undertaking GBS initiatives, as well as from leading global business and IT service providers. We highlighted the top findings from the 1Q14 Pulse in a prior blog.

The first-quarter edition of the Sourcing Advisory Pulse took a deeper dive into the state of GBS usage among organizations in the market today. GBS adoption rates continue to accelerate, as illustrated by the Pulse survey, KPMG’s client work, and a recent wide-ranging KPMG market study on GBS trending. For some organizations, “GBS” is mainly lip service to a new acronym using old service delivery models of shared services and outsourcing. But for others, it is a more fundamental rethink of back- and front-office service delivery best practices. Increasingly for more progressive organizations, GBS is an operating model and strategic governance framework that helps them transform end-to-end business support services to deliver more effective solutions for both the internal and external customers of the organization. GBS efforts focus on optimizing the mix of human capital, service delivery models, process innovation, and technology to deliver services on an enterprise-wide, cross-functional basis, to support the business strategy. Many of these organizations are continuing to expand the scale and scope of their GBS efforts to include a broad range of business and IT functions and processes. This expansion is also occurring geographically, cross-functionally—linking and integrating adjacent functions—and from an end-to-end process standpoint.

The maturity of GBS capabilities in most organizations adopting the model, however, is still evolving, or more bluntly, quite weak overall in most functional areas where the model is applied (see Figure 1). While more organizations are improving GBS maturity within individual process areas such as finance and accounting, or within a single business unit or geography, most still struggle with managing processes on an end-to-end basis (i.e., source to pay as opposed to just accounts payable) or between operations supported via both shared services and outsourcing. In this respect, as GBS efforts expand, the bar is being raised on what constitutes “good enough,” much less best-in-class, GBS capabilities.

Figure 1

 

Yet pursuing a GBS model with the goal of end-to-end process ownership across all geographies and business units in a highly integrated manner is not practically achievable for most organizations, and in many cases it is not even desirable given their overall business models. A firm organized in a looser, federated, or holding-company model, for example, would benefit less from a highly integrated back-office GBS model given the level of effort it would take to achieve. In this context, not all firms pursuing GBS maturity should necessarily strive for the top level of maturity, but most organizations can drive much more GBS maturity before they reach what is optimal for their situations. It is also critical for organizations to balance their GBS expansion ambitions with their capabilities to adequately manage these efforts.

Driving GBS maturity is a multidimensional effort. Inherent to improving GBS maturity is greater consolidation and leveraging of common IT systems, applications, and business process models, including the adoption of leading practices to deliver the in-scope services more efficiently and effectively. End-to-end process ownership, overall governance across collective shared services and outsourcing efforts, the integration of IT services into GBS business functions, and talent management of GBS resources are all common areas where many organizations must improve capabilities. While determining how mature is “enough” depends on many organization-specific factors, most GBS adopters are at the early stages of their maturity improvement efforts. Improving GBS maturity is critical not only to achieving the typical stated benefits sought from GBS efforts but also to maintaining organizational operational competitiveness on a global scale. Indeed, KPMG research increasingly finds a direct correlation between improved GBS maturity and market-leading, or at least market-competitive, financial performance as assessed by such metrics as profitability and return on equity.