

Toys and Big Data


“Dad, if my character dies in the game, would I die in the real world?”

What a beautifully naive question my son, Trevor, asked me during a son-dad conversation about how games might change over the years.

Early last year, Mattel’s CEO, Bryan Stockton, was fired. After three years at the helm, it was clear that Mattel continued to struggle with weak sales and lower gross margins, which drove down shareholder value.

As parents, we ALL know that it’s a very competitive toy aisle, and our kids are much different than we were at their age.

Mattel’s toys haven’t been “good enough” at a time when peers like Hasbro and Lego continue to report higher and higher sales. It’s not just Mattel. Nintendo, the one-time market-leading video game brand best known for legendary characters like Super Mario, has been struggling to keep up as mobile gaming explodes and “next-gen” consoles push the cutting edge.

So what’s happening to the toy market?

A New Toy Generation

I grew up as a child of the RPG (role-playing game) generation, starting with my own “Ken and Barbie” equivalent: my Hasbro Stretch Armstrong. I then graduated to the Lego era (starting with my Grandma’s Legos from Berlin), then to my favorite era of Hot Wheels, and then to Mattel Tyco Toys slot cars (by the 1980s, Tyco dominated the electric slot car racing market as well as the radio control category; Mattel acquired them in 1997).

I had a wonderful childhood of imagination, playing the roles of many superheroes on many adventures. As parents, we forget how wonderful our “inner stories” and games were.

“Dad, I’m busy right now. I’m in the middle of a story.”

My son would stare out the car window telling himself a story….imagining himself in the middle of some wonderful scene…something he thought up as part of his own imaginary world. I love watching him and his brother, Devon, role-playing battles in creative worlds that they both dream up daily.

So why aren’t Ken and Barbie, MegaBloks (Mattel’s version of Legos), and Mattel’s many other brands like BOOMco fueling this new generation of creatives, like their predecessors fueled mine?

How Are Kids Engaging Today?

I’ve never graduated from being a kid. Some of my friends say that I’m just a kid in an adult body. I recall telling my high school friends that I couldn’t wait to have children, just so that I could play with their toys.

Clearly the answer to the Mattel dilemma varies a bit by age. However, there is a common theme, starting even with the youngest children: most of our world is becoming digital.

For decades, children’s “digital experience” was essentially limited to watching television or listening to music. Few parents complained about a child becoming addicted to music or television. That has changed: screen addiction is now a growing concern among parents, and it extends well beyond TV.

Today, parents not only need to be vigilant about how much television their children watch, but also about the many other forms of media coming from the internet, smartphones, iPods, iPads, Wii games, and the like.

Kids today spend over 50 hours of “screen time” every week. In our family, “screens” include the TV, any computer device, and any phone. Kids will go to many extremes to get on a “screen”.

“Trevor…Devon, where are you?”

My wife will call my boys’ names to find out where they are in the house…only to find them tucked in the crawl space under their beds, hiding behind their respective screens.

The media content they consume has a profound impact on their social, emotional, cognitive, and physical development.  Learning how to use media and technology wisely is an essential skill for life and learning in the 21st century. But parents, teachers, and policymakers struggle to keep up with the rapidly changing digital world in which our children live and learn.  Now more than ever, they need a trusted guide to help them navigate a world where change is the only constant.

RPGs Evolve to MMORPGs in the Digital Domain

Role playing with action figures has evolved into a suite of digital and virtual environments that provide massively multiplayer online role-playing game experiences. Why wouldn’t my kids want to move from playing with their Lego figures to playing Minecraft with a host of their friends? In fact, games like Wynncraft and Phyria are taking digital games for kids to a whole new level. But this all scares me.

Do I want my kids to connect with their friends through a screen….or, rather, be outdoors with their favorite toys? In both cases, their creative natures are fueled. I’ve seen some very creative Minecraft worlds constructed by both boys. But I’m torn. I think I’d prefer to see that same “world” constructed in Lego in the backyard, hidden under a wood box with rocks on it. Wouldn’t you?

How do we combine offline and online experiences, providing a healthy balance?

Toys + Gaming = IGTs (Interactive Gaming Toys)

According to the NPD Group, three out of four parents (77%) stated that purchasing an IGT, a new generation of toys, was worth the investment compared to other types of toys or games they could have purchased. Almost two-thirds of parents stated that they are extremely or very likely to purchase a new IGT game (65%) or a new character (67%) in the next six months. So what is an IGT? Also known as “Toys to Life”, it’s an approach where our children’s toys become more real:

  • LEGO Dimensions: an upcoming Lego action-adventure video game developed by Traveller’s Tales and published by Warner Bros. Interactive Entertainment for the PlayStation 4, PlayStation 3, Wii U, Xbox One, and Xbox 360. LEGO has grown its dollar share among females in recent years, so with LEGO Dimensions there is potential to shift the IGT consumer to be slightly more female.
  • Disney Infinity 3.0 (Star Wars): an upcoming action-adventure sandbox video game published by Disney Interactive Studios and LucasArts for Microsoft Windows, PlayStation 3, PlayStation 4, Wii U, Xbox 360, and Xbox One, and the third installment in the toys-to-life Disney Infinity series. Star Wars has cross-generational appeal, playing into the fact that IGTs target parents as well as children.
  • Amiibo: Much like Star Wars, Nintendo’s cast of characters has cross-generational appeal. Many gamers grew up on Mario and Zelda, which has the potential to draw a new, potentially older consumer into the IGT space.
  • Skylanders: Activision created this gaming segment, and each year they have innovated on the initial Skylanders concept. Ideas like Swap Force and Trap Team added new gameplay elements to the experience in recent years. Though details are scarce on a new Skylanders title, I am looking forward to seeing what Activision has up its sleeve (likely to be revealed at E3 this year) and how it could further expand the market.

These are a combination of toys and digital games, combining the physical and digital worlds. It’s an interesting direction…one fueled by the interest of the next-generation child.

Toys & Big Data

CEOs of Fortune 1000 companies all over the world invite my team into the boardroom to discuss how information (data) can help them truly become digital. Why? Because they know that data is at the center of their business…at the core of a future suite of digital applications delivering new customer experiences.

This is all about “re-imagining” your business, by starting with the customer’s digital experience. In this case, it’s our children.

If you were Christopher Sinclair at Mattel, what would you be dreaming up for your children’s world of toys? Let’s imagine a combined physical and virtual world that provides a completely new digital experience for our children. This is what Mattel’s senior team has begun.

In February this year, Mattel announced “experience reel” cards offering exclusive content through a Google Cardboard application built specifically for Mattel customers. This means that you can wear Mattel toy glasses and combine your physical and virtual worlds. You can step into Barbie and Ken’s virtual world.

Imagine your children taking your home and painting it with colors they like, adding virtual furniture they prefer, hanging their own art on the walls, interacting with their favorite characters in various rooms.

Mattel also announced a partnership with San Francisco startup ToyTalk, whose cloud-based app can give your toys conversations, supporting the idea of conversational play. The doll uses speech recognition to capture your kid’s conversations and store them in the cloud: any human speech within its hearing can be recorded, stored, and analyzed so the doll can respond intelligently. On Christmas Day, Barbie could ask a child what they received from Santa. Or Ken could ask, “What do you want to be when you grow up?”
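The listen, store, respond loop described above can be sketched in miniature. This is purely illustrative: every name and function here is my own invention, not ToyTalk’s actual API, and a real system would use trained speech-recognition and intent models rather than keyword matching.

```python
# Illustrative sketch of a conversational-toy loop (hypothetical names,
# NOT ToyTalk's API). A real system would use a trained intent model.

INTENT_RESPONSES = {
    "santa": "What did Santa bring you?",
    "grow up": "What do you want to be when you grow up?",
    "scared": "It's okay to feel scared. Want to tell me about it?",
}

def choose_response(transcript: str) -> str:
    """Pick a canned reply by keyword match on the child's utterance."""
    text = transcript.lower()
    for keyword, response in INTENT_RESPONSES.items():
        if keyword in text:
            return response
    return "Tell me more!"

def handle_utterance(transcript: str, cloud_log: list) -> str:
    # Everything the doll "hears" is stored for later analysis...
    # exactly the privacy trade-off described above.
    cloud_log.append(transcript)
    return choose_response(transcript)

log = []
print(handle_utterance("I got a bike from Santa!", log))  # What did Santa bring you?
```

The interesting part, for both the opportunity and the privacy debate, is the `cloud_log`: the responses are simple, but the transcript archive is what makes the toy “smart” over time.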

Digitally-enabled toys? Digitally-enabled play experiences? Combined physical and virtual worlds? MMORPGs? IGTs?

I’ve been involved with data and analytics infrastructure for as long as I can remember….since the early 1980s. I’ve been trained in the nuances of how to collect, store, analyze, and operationalize data for a variety of use cases throughout my career.

What I see is an infinite number of opportunities to leverage data for a host of new educational, personalized, engaging toy applications. When you know what your child is thinking about, interested in, or worried about, toys become a gateway not only to a personalized experience (e.g., a toy that responds to your child by name), but also to a listener that can alert us parents to opportunities to address our children’s fears, their curiosities, their thirst for knowledge.

Toy makers can use the same information to better classify children, constantly improving on their toy designs, their games, their educational curriculum.

“Dad, can we turn our house into legos?”

Imagine….a digital world where anything is possible:

  • Your family is in a rush to leave the house. Your toddler starts screaming because he can’t find his favorite stuffed animal. You pull out your iPhone and see a signal that the animal is in the bathroom upstairs. You retrieve it in two minutes versus an hour-long, blind search.
  • Imagine all Barbie dolls have iBeacons. If there is another doll in the area (with another girl), your daughter could find her to see if she wanted to play. iBeacons could take hide-and-seek or tag to the next level.
  • Turn the real world into a secret virtual game. With every player outfitted with a Mattel toy iBeacon, you can play an impromptu game of freeze tag: meet the new flash-mob game.
  • You’re at Comic-Con (a conference for comic fans), and your child would love to “run into” their favorite character. Pull up the event’s application and see a signal showing where that character is. When you’re nearby, tell your kid to close her eyes and make a wish to see if the character will appear. Within moments, the character comes around the corner, and a wonderful memory has just been made.
  • A child walks down the street with a connected baseball glove. As he or she walks past houses on the way to the park, other kids’ connected baseball gloves start to buzz. They look out the window, see a neighbor kid heading to the park, and go outside to play a pickup game. Technology brings back the “good old days” when kids actually played games in parks with other neighborhood kids.

These are just a few fun ideas from Jen Quinlan, who brainstormed a few ways that iBeacon technology could be applied with real-time analytics, contributing to a new digital world with new interactive digital applications.
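Several of these ideas hinge on estimating how far away a beacon is from its Bluetooth signal strength. A common way to do that (a sketch on my part, not anything Mattel ships) is the log-distance path-loss model, which converts an RSSI reading into an approximate distance; the -59 dBm calibration value below is a typical indoor figure, not a product spec.

```python
def estimate_distance(rssi_dbm: float,
                      measured_power_dbm: float = -59.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Estimate distance (meters) to a beacon from received signal strength.

    Log-distance path-loss model: measured_power_dbm is the calibrated
    RSSI at 1 m. The defaults are common indoor values, assumed here
    for illustration only.
    """
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# At the calibrated power we are about 1 m from the doll;
# a 20 dB drop puts us roughly 10 m away.
print(estimate_distance(-59.0))  # 1.0
print(estimate_distance(-79.0))  # 10.0
```

In practice RSSI is noisy, which is why beacon apps smooth readings over time and report coarse zones (“immediate”, “near”, “far”) rather than exact distances.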



Gerstner Secrets of Leadership

Lou Gerstner became president of American Express in 1985 at the age of 43. He dismissed the speculation that his success was the product of being a workaholic. Gerstner said, “I hear that, and I can’t accept that. A workaholic can’t take vacations, and I take four weeks a year.”

As I write this, I’m in Wyoming with the family enjoying Yellowstone and Jackson Hole, thinking, “Can I somehow achieve the level of impact of Lou Gerstner with the right work-life balance?” What keeps people from having to cancel vacations, modify their schedules to take budget calls, or work while the family sleeps?

From 1998 to 2001, Mike Lawrie, CEO of CSC (where I work today), was General Manager of IBM’s business in Europe, the Middle East, and Africa. Overlapping Mike’s tenure, Lou was chairman of the board and chief executive officer of IBM from April 1993 until 2002, when he retired as CEO in March and as chairman in December. You can only guess that many of the tricks Lou used to turn around IBM were taught to the executives who were paying attention, like Mike. I share some of my perspective from what I’ve, in turn, learned….some of what I believe to be the “Gerstner Secrets of Leadership”.

In general (again, if you are paying attention), I’m convinced that you can learn from several generations of the most advanced leaders. I have enjoyed watching and participating in what I am now convinced is one of the most basic and yet most important components of leadership: a proper “management system”. I’m not sure why they don’t call it a “leadership system”….but here are some of my observations, which I believe can be applied to an enterprise of any size.

The Wrong Management System

I first have to describe what I believe is a “typical” management system, run by “typical” leaders (or in many cases dysfunctional management systems by people who don’t lead, but simply manage).

There are many organizations built on the principle that a few people are “in the know”, while the rest of the organization is somewhat in the dark, waiting to be told what to do. In this situation, many do their jobs without any appreciation or understanding of how their actions fit into the bigger picture.

The organization has no understanding of the vision, of each person’s exact role, and, most importantly, of their ability to participate in being a part of the change. Those who know me know that I prefer to lead with cultural change first, and my mantra is involving my staff in that change in an open and transparent way.

Many organizations are filled with people who manage by keeping information to themselves….thinking that either “knowledge is power” (so this is an intentional activity) or that “it’s none of your business” (somewhat less intentional). Why share what you are thinking or what you are doing with others when it’s not their job to know or, heaven forbid, have an opportunity to challenge your thinking?

Have you ever heard, “This is my responsibility, not yours….there are many other things you are not involved in, because your role is ‘xyz’. Let me handle this, ok?” or “There is no need to share this; that is part of the P&L that I run.”? I have heard these exact words from peers in the past. It’s a great example of the type of behavior that is empowered within an organization run “hierarchically” versus one run “collaboratively”. I believe that this type of behavior should and can be corrected with the right leadership…..starting with the right management system.

In many environments, a leader either intentionally or unintentionally is acting as the gateway for information. Leaders, by default, have more access to what is happening within the organization.

In a dysfunctional environment, dysfunctional leaders are a sponge for any/all information of how the business is being run…but it stops there. Information only flows in a single direction. These leaders are typically very hard to schedule time with, they are always traveling, visiting with other leaders within the business. Sound familiar?

When something important in the business happens, a member of the team will communicate it up to the leader. The leader may or may not pass on that information to others. In many cases, if they do, it’s to an “inner circle” of people in a small “clique”….those who have an immediate need for that information. Others find out through “the grapevine” later, if they are lucky. If there is no real forum for communication, a broadcast of and discussion about this information is challenging. Sound familiar?

Take a large IBM-type of organization where there are several lines of business (LOB), regions with geographic leadership, and industries with domain-specific leaders. This is a very typical matrix model for large and mature companies. So let’s study a dysfunctional model a little more.

[Diagram: Traditional Leadership Model]

In this model, an LOB leader will take it upon themselves to engage with other LOB leaders, regional leaders, and industry leaders…depending on the need and/or who has more political power and knowledge. These LOB leaders can speak very intelligently to their business, of course, and “manage up” very well. They educate themselves well on the aspects of the business, as they should. But it usually stops there.

In a dysfunctional organization, the rest of the organization is clueless, and is always trying to “read the minds” of their leadership, playing constant catchup. Some succeed in this type of organization by becoming part of the “inner circle” or being aggressive in their attempts to align. Efficiency is low and execution marginal. Direct reports get frustrated, many leave, and those who remain are heard saying, “I haven’t had a chance to meet about the quarterly objectives. I don’t know where he/she is…I think they are traveling to Italy, Switzerland, and then Turkey.” Sound familiar?

Don’t get me wrong. Getting out of the office and engaging the organization is good. Meeting with others in the organization and/or customers is absolutely necessary. In fact, if you’re not spending at least a third, if not half, of your time in front of customers, partners, and industry, you’re much too “internally focused”. We use the phrases “outside in” and “inside out”.

As a leader, however, before you leave home, you have to take care of business at home first: provide your staff with a proper framework that promotes communication and transparency…in a way that creates the highest-performing team and builds trust. A high-performing team trusts each other the way a Navy SEAL team might trust each other going into battle. When your team has the fundamentals in place, THEN go on your roadshows. But not until you have the right management system in place FIRST. This is where most leadership falls on its face…no matter how big or small the organization is.

Have you ever been in the position where you have a new boss (a CEO in a small company, or a VP/EVP in a larger company) who spends their first nine months on the job doing an organizational assessment, devising a strategy, and then beginning execution….all with no formal one-to-one meetings, only a few senior team meetings where each direct report had 10 minutes to share status, a couple of all-hands meetings, and maybe an annual offsite planning meeting with only two-thirds of the team? If you are reading this and saying to yourself, “So, what’s wrong with that? Seems typical to me,” you are in a great position to learn from Lou Gerstner, Mike Lawrie, Jack Shemer, and the like.

The Right Management System

In my humble opinion, the Gerstner-type of organization has a management system which doesn’t allow this type of behavior to persist. So what would a high-performing, “next-gen” management system look like?

[Diagram: Next-Gen Leadership Model]

Here’s an analogy. Imagine an organization where the leader is the conductor of an orchestra. Now imagine that each musician is playing an instrument but can’t hear the others…or only intermittently hears them….and none of them can see the conductor from where they are sitting. The conductor might move into a line of sight, give a quick bit of direction, and then disappear. How might that sound?

Members of a high-performing team must be able to hear each other, and get regular direction from their leaders. Strong leaders enable communication and transparency and provide a system that forces collaboration between members of the team and collaboration with others outside their line of business.

In such an environment, knowledge transfer is key…NOT knowledge hoarding. Open and frequent discussion, with team members challenging each other, is required, not forced communication through email or ad-hoc phone calls. A strong communication pattern allows the team to hold one another accountable for each other’s success, rather than focusing on themselves and “my P&L”.

When information enters this system, especially important and timely information, it travels extremely fast. Ask any member of the team and they are ALL “in the know”, not just one person or some small set of individuals. This team is high-performing and can address critical, time-sensitive issues fast. The level of trust is high, because there are no secrets. And equally important, the members feel like they are part of the change, part of the impact, and part of a team. This builds a healthy culture.

When leaders don’t have a strong management system, the staff becomes dysfunctional at best.

Typical Excuses From Poor Leadership

You might have heard, “Why do we need 1to1s for our team? We have well-experienced executives and they can reach out to me when they need to discuss something important. Everyone has my cell phone and knows I’m available day, night, weekday, or weekend.”

Or maybe you’ve heard, “We can’t afford to have the senior team come together so often. Time is money. You should know what you have to focus on, and if you don’t, come and talk to me.”

“We can’t afford to have the global team together each quarter for business review and planning. We’ll do this once a year for budgeting, and can find time together during other events over the year.”

So what is the right time commitment and meeting cadence? What type of management system is appropriate? Personally, everyone who has worked directly with me over the past 15 years has heard me say that I only need 5% of your time as a team, on average. I have a very specific management system for CEOs of startups (a role I’ve held many times, for many years).

Senior leaders of larger organizations, however, are expected to spend 20% and maybe up to 30% of their time internally focused. It becomes even more important that people have access to you, and to each other, as a senior team.

From my experience, in a large IBM-like organization, I will spend 20-25% of my time in meetings which I can book a year in advance. Yes, A YEAR IN ADVANCE. That doesn’t mean the times in your calendar never change (although a strong CEO will rarely reschedule), but you at least know the kinds of meetings, and the cadence, that you generally want to keep consistent (so others can plan their customer visits, vacations, etc. accordingly, and not have to cancel, rearrange, or augment).

Successful leadership starts with understanding this, and establishing the right meeting mix and cadence that drives communication and transparency.

An Example Management System

When I took over a business unit of a modest size on a Thursday, we were working as a team on our management system the following Tuesday. Why? Because I knew, without the proper structure in place, I would immediately become part of the problem….especially in a fast-moving, fast-changing organization. I knew that establishing the right amount of connective tissue among the team was the biggest gift that I could give them.

Let’s study the example of an EVP who runs a Line of Business (LOB) of several thousand people, with several component businesses, many geographic regions (a global company), and offerings that address multiple industries. As an EVP, you report to the CEO and are part of a larger “CEO office” with other EVPs. So how would you balance your time across your own LOB, other LOBs, regional leadership, industry leadership, and the CEO?


I believe you need to target 25% of your time for a meeting structure that involves the “internal team”, leaving the other 75% for customers, partners, industry, your own creative/strategic time and, of course, overhead (things like eating, traveling, preparing for meetings, and email). So, with several thousand people under your leadership and an LOB team of 16 executives (Operations, CTO, Strategy, Sales, Marketing, HR, Regional GMs x5, Practice GMs x4, Offerings), here’s how my management system looks after tweaking things a bit:

  • Management Meeting Time: 24%
  • Customer Visit Time: 33%
  • Industry / Partner: 15%
  • Strategic/Self Dev: 10%
  • Other (Travel, OH): 18%

In this example, you have four categories of “management system” time:

  • Your own LOB Senior Leadership Team
  • Regions
  • Industries
  • CEO Senior Leadership Team

Here’s a deeper view into a proposed management system for those interested.

LOB Component of Your Management System

Your internal LOB meeting structure might look like this (this is your own senior team):

  • EVP & Senior Team 1to1s: 30 minute, monthly calls with your direct reports.
  • Practice & Regional LOB 1to1s: 30 minute, biweekly calls between your practice and regional GMs (you force this to happen)
  • LOB Team Mtg – Issue Resolution: 60 minute, weekly issue resolution meeting with entire senior team
  • Daily Ad-Hoc Issue Resolution: 15 minute, daily standup dial-in calls (3x, not including Mondays or your team meeting days)
  • All-Hands: Monthly all-LOB-hands 60 minute update call
  • Quarterly Business Review / Strategy: 1Q – 3Q, 3-day offsite
  • Annual Strategy / Budgeting: 4Q, 3-day offsite

Believe it or not, this is only about 10% of your time! Imagine how your staff would feel if you created this level of connection/communication at the expense of only 10% of your total time budget! Most people agree with this, EXCEPT for the daily standups (more on that later).
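As a sanity check on that figure, the cadence above can be turned into a back-of-the-envelope calculation. The 50-hour week, 48 working weeks per year, and 8-hour offsite days are my own assumptions, not numbers from the post:

```python
# Back-of-the-envelope check of the LOB meeting cadence, for 16 direct
# reports. Assumed: 50 h work week, 48 working weeks/year, 8 h offsite days.

HOURS_PER_WEEK = 50.0
WEEKS_PER_YEAR = 48.0

weekly_hours = {
    "1to1s: 16 reports x 0.5 h, monthly": 16 * 0.5 * 12 / WEEKS_PER_YEAR,
    "team issue-resolution: 1 h weekly": 1.0,
    "daily standups: 0.25 h x 3 per week": 0.25 * 3,
    "all-hands: 1 h monthly": 12 / WEEKS_PER_YEAR,
    "QBRs: 3 days x 3 quarters": 3 * 3 * 8 / WEEKS_PER_YEAR,
    "annual offsite: 3 days": 3 * 8 / WEEKS_PER_YEAR,
}

total = sum(weekly_hours.values())
share = total / HOURS_PER_WEEK
print(f"{total:.1f} h/week = {share:.0%} of the week")  # 6.0 h/week = 12% of the week
```

Under these assumptions the full cadence costs about six hours a week, roughly 12% of a 50-hour week: the same ballpark as the 10% figure.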

Regional Component of Your Management System

Let’s use the example of a US-based company where the “regions” are outside of the Americas, and include Central & Eastern Europe, South & West Europe, UK & Ireland, ANZ, AMEA, and India (7 regions altogether). Here is a proposed meeting structure:

  • LOB & Regional EVP/VP Leadership 1to1s: Interlocks with regional peers spending 30 min per month per region (not incl f2f)
  • LOB Regional Staff & Regional Leadership 1to1s: Your own regional GMs and their local regional leadership (EVP/VP) meet 30min biweekly (you encourage this to happen)
  • Regional Staff 1to1s: Your own regional GMs and their immediate staff 30min biweekly (you encourage this)
  • Regional Team Mtg – Issue Resolution: Regional GMs operate an issue-centric meeting like yours, 1 hr weekly (you encourage this)
  • Quarterly Business Review / Strategy: Regions have their own QBRs (with in-region leadership)
  • Annual Strategy / Budgeting: Regional-specific involving your staff
  • All-Hands: Regional-specific involving your staff

In this case, you (the EVP) will not participate in all of the above. However, you’ll make sure these meetings are established. For example, if you have 1to1s with your regional GMs, but they do not have 1to1s with their local leadership, then you’re failing. It’s YOUR job to make sure the overall management system is in place.

For the EVP, this takes up 1.5% of your overall budget, not including actual face-to-face (f2f) visits to the regions. However, when you take trips abroad for weeks at a time, I include this management meeting time as part of the 1to1s, customer visits, strategy, etc. You may plan to spend one week in any given region, two regions per quarter, as part of your “roadshow” calendar. However, those trips always involve a much broader agenda, and should not compromise the above for either you OR your team.

Industry Component of Your Management System

Engagement with industries might involve:

  • LOB & Industry EVP/VP Leadership Meetings: You will want 1to1s minimally each quarter with each of your industry leaders for an hour (review your quarterly business plan highlights with them…and I don’t just mean TCV, Revenue, and OI)
  • LOB & Industry Senior Teams: This is a 1/2-day as a part of the LOB QBR where you engage an industry leadership team and work them into business planning.
  • Quarterly Business Review / Strategy: These are the QBRs that the Industries hold themselves (you may join)
  • Annual Strategy / Budgeting: These are held by Industry leaders (who may ask you to join)

So, your Industry organizations will have their own QBRs and annual budget planning meetings. What you want is to steal a part of their time so your team can engage with the industries in a coordinated way. If you have 6 industries, how do you make sure that your organization touches ALL of them in a thoughtful way (ideally, you don’t want uncoordinated/random engagement)? I, personally, like to make sure that my team touches 2 industries per quarter as part of my QBR. Therefore, I invite at least one Industry group (or up to two) to my QBRs. That means a 3-day QBR agenda would have 2 days of LOB/Practice+Regional content and 1 day of Industry content. Novel, huh? Forced cadence for the organization as a whole. Most “leaders” don’t get this. They just leave it up to their senior team to engage the organization randomly (why not, they are well-paid executives…who should be able to figure this out, right?).

For the EVP, this equates to a little over 2% of your time budget.

CEO-Office Component of Your Management System

For the EVP, plan on another 9%+ of your time budget meeting with the CEO. Meetings might involve:

  • Operational Review: Going over the LOB P&L metrics
  • Decision Committee: Proposing and agreeing on core changes needed across the organization
  • Investment Review: Where you make your R&D investments
  • Key Client Review: Agreeing on the proposed contracts for large deals
  • Sales/Delivery Excellence: How to improve winning and delivering
  • Deal Committee: M&A opportunities
  • Alliance Review: Partnership traction/strategy
  • HR Steering: Cultural/talent change
  • Financial Review: Managing costs
  • CEO & EVP/VP 1to1s: Real-issue discussions

You may not be involved in all of these meetings, but in a healthy organization, the cadence is strong. In a high-performing organization, the cadence is not only strong, but the dialog is hard…meaning that the CEO teases out the real issues from the team, and doesn’t just spend time talking or asking their staff for status.

Your 1to1s can be the most important

As one example, let’s dive into the mechanics of 1to1 meetings. First of all, this includes EVERYONE. I don’t place any more or less value on the roles around the table. That means that HR and Marketing are at the table. In a dysfunctional organization, many of the “shared” functions are left out of the “strategic” meetings.

Therefore, in my example this includes Operations, CTO/RD&D, Strategy, Sales, Marketing, HR, Regional execs, Practice leads, and Offerings: 16 people in all, which to some might seem unmanageable. It is a real time commitment, even at just 30 minutes every two weeks per person. You might hear yourself asking, “Why can’t I do this on an ad-hoc basis?”

This meeting creates a connection between you and your team that is hard to value at first. I argue that it can NOT be replaced with ad-hoc meetings over lunch, in the bathroom, or in the hallway. No, this is a meeting where you talk about the topics that are important to each member, individually…and the topics can range quite a bit from meeting to meeting. They may include:

  • Performance Rating / Comp (are you feeling valued?)
  • Objectives for the Quarter (obstacles to accomplishing?)
  • Career Goals (how to develop / invest in yourself?)
  • Peer issues (problems with others in the org?)
  • Company blocker / issues (things in general I can help change?)
  • How can I improve?
  • Tactical Items (hot topics?)

I keep copious notes on my 1to1s that I refer back to each time I have a meeting. It’s like having our own personal “psych session”. Sometimes for me, sometimes for the team, sometimes for both of us. I also make any/all conversation fall into the “cone of silence” where it never goes beyond us, unless told otherwise. These occur biweekly to once per month, depending on the size of the organization.

“Real Issue” Team Meetings

Another “pet peeve”….how to orchestrate your own team meetings. Do you simply go over quarterly status week to week? Do you ask each of your team members to spend 10 minutes, and simply hand them the “microphone” in a “stand-up” kind of way? I think this meeting time is one of the most challenging…because getting your team to “open up” about what is bothering them and what key obstacles are in their way is the difference between “management” and “leadership”. Read The Five Dysfunctions of a Team (Patrick Lencioni), and get back to me.

Here is how I like to run my weekly senior team meetings (I call them “roundtables”).

Agenda:

  1. Good news check-in / Larger announcements or news
  2. Anything holding us back on our quarterly objectives?
  3. Discussion around “Real Issues”
  4. Documenting critical action items

If time permits:

  • Top priorities for next week
  • Customer and employee hassles not already covered
  • Discussion around overall quarterly status
  • Key events coming up

But the key here….is focusing the majority of the time on “Real Issues”. To date, I have NOT seen executives focus on this…so let me explain a little and make the case for this meeting style.

Definition of a “Real-Issue”:

  • A topic that would make your stomach churn if brought up as a team
  • Something that you are uncomfortable talking about (especially as a team)
  • Event(s) which are affecting your staff and/or organization negatively

Why do executive meetings need to address “real-issues”?

  • Teams (companies) fail based on process (team dynamics), not content (what is actually being talked about)
  • Every team “hits a wall”. Great teams work through the “real issues”.
  • Every “real issue” that has the potential of “blowing the team apart” is exactly what makes it stronger.
  • Reality always wins. It’s our job to get in touch with it.
  • There are no secrets in teams, only dysfunctional dynamics that pretend there are.

Getting your team to share the “real issues” takes time, because it requires them to trust you and each other. This is a topic in and of itself.

What do you think? What tricks have you learned and applied to establishing your management system?

A great reference from one of my past executive team members: Roger Nierenberg at TEDx on “The Music Paradigm”. This is AN ABSOLUTE MUST WATCH.

 

Posted in Leadership.



Where Did My Love For Data Start?


 

Bynet Team Photo (left to right, top to bottom) Top Row: John Wright, Jim (Hjerpe) Kaskade, Sumit Sharma, Dennis Russell, Bob McMillen; Bottom Row: Paul Micheletti, Lawrence Ladao, Serdar Yilmaz, Bob Moussavi, Doug Hundley.

Jack Shemer & David Hartke – True Legacies

Whenever Jack visited me, he used to leave sticky notes on my desk with nuggets of wisdom. For example, “Keep people you trust close to you.”…or, “Key values for Teradata were: Pride, Enthusiasm, Importance of the Individual, Teamwork and Open Communications, Ethics, Dedication, Quality, Support, Success, and Entrepreneurship.”

In July 1999, Jack Shemer and David Hartke both decided to come out of retirement to help me and my team start a new company, INCEP (along with a few other veterans of the industry including Art Collmeyer, Bob Adams, and Phil Paul). Little did I know, Jack would not only “give me my wings as a CEO”, but he started a process which ended up transforming me, creating the value system I use today.

“Initial Partner Presentation 1980, prior to Beta test in Dec, 1983” was the note he wrote on his initial Teradata business plan, which I still have today. Inside was a copy of a less formal “Preliminary Business Plan” dated April, 1980. Jack (CEO) and Phil Neches (CTO) were both on the “payroll” then (with only $175K of seed capital, later to be augmented with institutional money from Brentwood Associates run by Tim Pennington and Kip Hagopian). Co-founders David Hartke (Engineering) and Jerry Modes (Finance) planned on leaving their current day jobs within the next month (after their first true round of financing). With funding they could bring the entire founding team together plus a few project leaders.

Their first milestone was the “demonstration of a complete, working hardware prototype” within 18 months (December 1981). Jack was asking for only $2.5M of initial venture funding to carry the team through milestone 1, and another $3M to get to the “first system ready for shipment to a customer” by December 1982 (month 30). They eventually closed a Series A of $2.6M on July 23, 1980, and subsequently raised $12M in December 1981, $12M in January 1983, and another $40M over three additional rounds in 1984, 85, and 86. Teradata IPO’d August 1987 raising $37.5M of public capital.

YNET

Few people know that the backbone of the Teradata Database Computer (DBC) was originally referred to as the HINET (“High Speed Network”), later renamed to the YNET, and then redesigned as the BYNET.

The DBC1012 was designed to attach to existing mainframe and mini-computers to provide a substantial increase in system throughput, response time, ease of use, and reliability….using a relational model DBMS. The target was to increase throughput by as much as 4 times that of IBM’s IMS, and support two orders of magnitude more database size and processing power. [Note: I'll compare this approach and the current approach of Hadoop from providers like Hortonworks and Cloudera in a future post.]

The YNET was engineered to interconnect up to 1024 processor modules (Interface Processors or IFPs, and Access Module Processors or AMPs) in a distributed, shared nothing, Massively Parallel Processing (MPP) configuration. The YNET was originally envisioned to support broadcast and sorting, allowing for linear scalability (e.g. performance improvement does not degrade as processor modules are added). Believe it or not, the system was engineered to scale from 1.5 to 512 MIPS (yes, back in 1980 only 512 MIPS).

Fast forward, in 1990, a team was formed in a joint development between NCR and Teradata. The project code name was “P90” and it consisted of an elite team of 100 engineers from Teradata and 100 engineers from NCR, who were placed in an abandoned building in Torrey Pines, San Diego. Our charter was to “kill IBM” by producing the most powerful next-generation database system in the world.

At the time, the YNET still provided for communication among all processors (AMPs, IFPs, and COPs – the COP had the same functions as the IFP, but was used to communicate with network attached DOS-PC/UNIX hosts). The YNET always operated in a broadcast (one-to-all communication) mode and the two YNETs (primary and backup) had a total system bandwidth of 12 MBPS at the time.

It was well understood that Jack Shemer and David Hartke’s invention, the YNET, would easily support 200-300 processors using 80386 Intel CPUs (rated at 4 MIPS each). However, scaling above 512 next-generation processor modules (rated at 100MIPS each) would result in the YNET becoming a bottleneck (the network would become the limiting function of scale).

BYNET

So we embarked on a journey to develop the next-generation YNET that could scale to 4096 high-performance nodes, where we could easily support 10 MBPS PER PROCESSOR MODULE, linearly scalable up to 4096 processors (vs. 12 MBPS in total on the network). Thus, a 512 node system would support bandwidth up to 10.2 GBPS.
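As a back-of-the-envelope check of that figure (my assumption: the dual primary/backup network arrangement of the YNET carried over to the BYNET, so each node contributes 10 MBPS on each of two networks):

```python
# Sanity check of the 10.2 GBPS claim above. The "networks = 2" factor is an
# assumption (primary + backup fabrics, as with the original YNET).
nodes = 512
mbps_per_node_per_network = 10
networks = 2

total_gbps = nodes * mbps_per_node_per_network * networks / 1000
print(total_gbps)  # 10.24, in line with the ~10.2 GBPS quoted above
```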

The other breakthrough was creating a network that allowed processors to communicate point-to-point, multicast, or broadcast. This design leveraged concepts from the Banyan Crossbar Switch, where the network is constructed from a modular switch node building block. In the case of the BYNET, we created a switch node that was an 8×8 crossbar, able to connect any of its eight input ports to its eight output ports simultaneously, arbitrating when conflicts arise. It operates much like a telephone network.

A sender (one of the many Teradata processors) “dials” a receiver (another processor) by sending a connection request to the network. The request contains an address or “phone number” which is interpreted by the switch nodes. Once the connection goes through, a circuit is established that is held for the duration of the “call”. To support up to 4096 nodes, we modified a folded indirect n-cube topology. No network topology quite like ours was known at the time, but it fell generally within the Banyan class of topologies.
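The "dialing" behavior of a single switch node can be sketched as a toy model. This is purely illustrative Python, not the actual BYNET hardware logic, and all names are hypothetical:

```python
class SwitchNode:
    """Toy model of an 8x8 crossbar switch node with call-style circuits."""

    def __init__(self, ports=8):
        self.ports = ports
        self.circuits = {}  # input port -> output port, held for the "call"

    def connect(self, in_port, out_port):
        """Handle a connection request ("dialing"); arbitrate on conflict."""
        busy = in_port in self.circuits or out_port in self.circuits.values()
        if busy:
            return False  # arbitration: one side of the circuit is in use
        self.circuits[in_port] = out_port
        return True

    def release(self, in_port):
        """Tear down the circuit when the "call" ends."""
        self.circuits.pop(in_port, None)

node = SwitchNode()
assert node.connect(0, 5)        # circuit established
assert not node.connect(1, 5)    # output 5 busy: request loses arbitration
assert node.connect(1, 2)        # non-conflicting circuits run simultaneously
node.release(0)
assert node.connect(3, 5)        # a released output can be "dialed" again
```

A real switch node arbitrates in hardware within a larger multi-stage topology, but the circuit-holding behavior is the essence of the telephone analogy.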

A folded network was chosen to support packaging large networks. Because this was a database machine with large amounts of data being routed between nodes, a circuit switched network (vs. packet switched) was implemented. The BYNET has no single point of failure with redundant paths between every input and output. The BYNET guarantees delivery of every message and ensures that broadcasts get to every target node. So the database isn’t plagued by communication errors or network failures and does not have to pay the price of acknowledgements or other error-detection protocols. This part of the Teradata system was truly disruptive.

Behind me and the team in the above picture is a 256 node Teradata 3700 database system, circa 1992.

This is where my love of data started.

[Note: This BYNET team was responsible for the new Teradata 3700 network architecture, BYNET protocol, BYNET switch node, BYNET I/O processor, BYNET Interface Controller Board, BYNET Type-A, B, and C Boards, BYNET CMA/A and CMA/B Backplanes.]

Posted in Data.



Internet of Things – Definitions & Predictions


Remember the days when we were debating the definition of “Cloud”? That was as recent as 2009. Fast forward to 2014 and we’re facing the same ambiguities with the Internet of Things (or IoT).

It’s a market that is as big as or bigger than Cloud. IDC expects the overall market for IoT to grow at a 12.5% CAGR from $1.3 trillion in 2013 to $3.0 trillion in 2020. IDC also forecasts that there will be approximately 30 billion autonomous things attached to the Internet in 2020, which will serve as the catalyst driving this significant revenue opportunity. IDC believes that services and connectivity will make up the majority of the IoT market — outside of intelligent systems; together, they are estimated to account for just over half of the worldwide IoT market in 2013. IDC expects that by 2020, this percentage will inch up to 66% of the worldwide IoT market (outside of intelligent/embedded systems) but give way to the increasingly valuable platforms, applications, and analytic services that are forecast to together equal 30% of the total market revenue.

It’s a big opportunity….IoT. So what is it exactly?

Wikipedia IoT Definition


Wikipedia defines the Internet of Things (IoT) as the interconnection of uniquely identifiable embedded computing devices within the existing Internet infrastructure. Typically, IoT is expected to offer advanced connectivity of devices, systems, and services that goes beyond machine-to-machine communications (M2M) and covers a variety of protocols, domains, and applications. The interconnection of these embedded devices (including smart objects), is expected to usher in automation in nearly all fields, while also enabling advanced applications like a Smart Grid.

IDC IoT Definition


IDC defines the Internet of Things as a network of networks of uniquely identifiable endpoints (or “things”) that communicate without human interaction using IP connectivity — whether “locally” or globally. IDC has identified the IoT ecosystem as having the following piece parts — intelligent systems, connectivity, platforms, analytics, applications, security, and services. While the overall market opportunity is represented in terms of billions of connected things and trillions of dollars in revenue opportunity — the question continuously asked is where the revenue opportunity lies across these different technology layers.

Other key aspects of IoT:

  • The IoT brings meaning to the concept of ubiquitous connectivity for businesses, governments, and consumers with its innate management, monitoring, and analytics.
  • With uniquely identifiable endpoints integrated throughout networks, operational and location data, as well as other such data, is managed and monitored by the intelligent or traditional embedded system that has been enhanced and made part of IoT solutions and applications for businesses, governments, and consumers.
  • IoT is composed of technology-based connected solutions that allow businesses and governments to gain insights that help transform how they engage with customers, deliver products/services, and run operations.

GigaOm IoT Definition

GigaOm defines the internet of things as an ultra-connected environment of capabilities and services, enabling interaction with and among physical objects and their virtual representations, based on supporting technologies such as sensors, controllers, or low-powered wireless as well as services available from the wider internet.

Internet-connected objects, devices, and other “things” are proliferating in every domain.

  • Farmers’ gates can be fitted with SIM cards to monitor whether they have been left open, or to allow farmers to close them remotely. Cows are being equipped with pay-as-you-go devices, which can send SMS texts when they are in heat.
  • Beer barrels now have radio tags so that they can be tracked from brewery to bar and back. Indeed, few supply chains exist today without some kind of automated product tracking. Many major supermarkets now offer bar-code readers to self-scanning shoppers, for example.
  • Startups such as Supermechanical and Electric Imp are creating monitoring devices that can be connected to light bulbs or other electrical devices, garage doors, or windows or simply left in the basement to check for water leaks.
  • “Things” don’t necessarily have to be small: Buses, trains, and cars can be fit with monitoring devices so they can provide accurate information to both control rooms and customers.

Gartner IoT Definition


Gartner defines the Internet of Things (IoT) as the network of dedicated physical objects (things) that contain embedded technology to sense or interact with their internal state or external environment.

The IoT comprises an ecosystem that includes things (e.g. baby monitor), communications (e.g. home broadband), applications and data analysis (obstructive sleep apnea prevention).

Machine-to-machine (M2M) communication services refer to connectivity services that link IoT “things” to central or back-end systems, without human input. Operational technology (OT) is enterprise technology used to monitor and/or control physical devices, assets and processes.


Industrial IoT vs. Human IoT


I like how Moor Insights & Strategy defines IIoT vs. HIoT. Designing for IIoT requires deep understanding of solution spaces and an ability to connect systems manufactured many decades apart. IIoT favors solutions vendors such as DIGI, Echelon, and Freescale, who have solid roots in the industrial control world. HIoT favors fast-moving prototyping driven by leaps of faith in user experience (UX) and device design, exemplified by the Maker community in particular, led by vendors such as Apple and Microsoft.

IoT Predictions

Gartner predicts that, by 2020, the installed base of the IoT will exceed 26 billion units worldwide and will consist of a very diverse range of smart objects and equipment. Between 2014 and 2020, the number of connected objects will grow explosively and few organizations will escape the need to deliver applications that link smart objects and equipment to their corporate IT systems. However, systems involving the IoT will be very different from conventional IT applications, so IT leaders must act now to prepare for this future.

IDC estimates that as of the end of 2013, there were 10 billion IoT units installed — with IP connectivity and communicating autonomously. IDC predicts that the installed base of IoT units will grow at a 16.8% CAGR over the forecast period to 29.5 billion in 2020 (see Table 3). Enablers for the impressive growth rate over the forecast period include but are not limited to pervasiveness of wireless connectivity, ubiquitous access to the Internet regardless of location, IoT standard protocols, government support for efficient technologies and services, innovation happening in a new segment of the tech market, and business process efficiencies and consumer realities around a connected lifestyle.
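The growth figures IDC quotes (a 12.5% revenue CAGR from $1.3T to $3.0T, and a 16.8% unit CAGR from 10 billion to 29.5 billion units, both 2013 to 2020) are easy to sanity-check with the standard CAGR formula:

```python
# CAGR from a start value to an end value over n years.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# $1.3T (2013) -> $3.0T (2020): ~12.7%, consistent with IDC's quoted 12.5%
revenue_cagr = cagr(1.3, 3.0, 7)

# 10B units (2013) -> 29.5B units (2020): ~16.7%, matching the quoted 16.8%
unit_cagr = cagr(10, 29.5, 7)
```

The small gap on the revenue figure likely just reflects rounding in IDC's published endpoints.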

IDC’s 2014 predictions for IoT are as follows:

  1.  IoT Partnerships Will Emerge Among Disparate Vendor Ecosystems
  2. Leaps of Faith in 2014 Will Create End-to-End IoT Solutions
  3. Open Source-minded China Will Be a Key Player in the IoT
  4. “Plumbing” of the IoT Will Attract Significant Activity in 2014
  5. IoT Will Come to Healthcare in 2014
  6. Mobility Software Vendors Will Continue to Show a Lack of Interest in IoT
  7. Worldwide “Smart City” Spending on the IoT Will Be $265 Billion in 2014
  8. A Smart Wearable Will Launch & Sell More Units than Apple or Samsung Wearables on the Market in 2014
  9. IoT Security Is a Hot Topic, But There Will Be No Heat Until There Is a Fire
  10. Professional Services Will Open Up the IoT Competitive Landscape
  11. Big Data Will Drive Value Creation from IoT

Areas of Opportunity?


Gartner believes that opportunities by revenue potential include:

  • Indoor Lighting
  • Media Consumption
  • Connected Cars
  • Smart Pills
  • Patient Monitoring
  • Smart Meters
  • Smart Computer Accessories / Home networking devices
  • Alarms & Security
  • Street & Area Lamps
  • Digital Cameras
  • Fitness/Sports
  • Thermostats
  • Toys
  • Other (Parking, Digital Signage, Trash, ATMs, home appliances)

Areas of particular interest based on GigaOM research include:

  • Health care is already making use of telehealth systems and services, an area likely to grow substantially over the coming years both inside hospitals and across community service delivery.
  • Agriculture is looking to combine sensor data (such as soil analysis) with environmental data, satellite imaging, and so on.
  • Physical retail is known to be struggling, particularly in light of lower-margin ecommerce. The future of physical retail lies in delivering improved experiences to customers, enabled by the internet of things.
  • Public safety and defense can benefit from the increased use of sensors and monitoring, combined with information from broader sources (environmental, geospatial, and so on).

Technology Business Research (TBR) believes lucrative returns will not come from sales of Internet-ready fashion, appliances or connected home electronics. TBR expects the IoT market over the next two years to be incredibly turbulent as customer adoption of new devices will be hit or miss, and device companies will invest hundreds of millions of dollars in product development with little return other than learning which devices do not appeal to customers. TBR believes health and fitness presents the largest opportunity, followed by personal safety. Ezra Gottheil believes that consumer IoT will complement and often lead the commercial IoT, and together, they will fuel a wave of innovation and expansion in all segments of IT. Jack Narcotta believes that the biggest upside will come from the platforms device vendors will construct. New IoT-ready platforms will enable vendors to embrace the first generation of IoT devices and allow the devices to intercommunicate with vendors’ current respective ecosystems. New platforms will also effectively future-proof vendor IoT strategies from the effects of fickle customers, peaks and valleys in demand for specific form factors, and the introduction of new protocols and technical standards.

IDC believes that there is no one vendor that will emerge as a winner. This market will involve a collection of vendors, service providers, and systems integrators that need to coexist and integrate products and solutions to meet the needs of customers — enterprises, governments, and consumers.


IDC believes that the opportunity is being led by the IT hardware vendors, followed by software.  This is probably based on the fact that hardware spending in general, about 40% of total IT spending, drives downstream spending in software and services.


So where does one focus? B2C or B2B markets? Both? IDC sees almost an even split.

B2C?:

  1. Security
  2. Children/Pet safety
  3. Energy
  4. Health & Fitness
  5. Smart Appliances

B2B?:

  1. Inventory Management
  2. Fleet Tracking/Diagnostics
  3. Shipment Monitoring
  4. Security
  5. City Systems (Parking, Street Lamps)

But $3T by 2020? That’s actually down from the $7.1T 2020 revenue forecast in their Worldwide and Regional Internet of Things (IoT) 2014–2020 Forecast: A Virtuous Circle of Proven Value and Demand (IDC #248451), back in May 2014. The change in forecast was due to how IDC re-weighted which use cases would grow faster. The high-end IoT scenario involves systems with advanced monitoring and analytics. Mid-high-end scenarios have rich “track and trace” capabilities with exception-based reporting plus prediction. Mid-low-end scenarios involve exception reporting, but without prediction. Low-end scenarios are simple “track and trace” only…no exception handling, and no prediction.

A “thing” categorized within the “low-end scenario” could generate $2 per month with a temperature sensor in a carton/container, generating $24 per year (e.g. tallying the goods moved from Asia to the US via boat freight). The “high-end scenario” could generate $100 per month with a monitor in a critical care unit, generating $1,200 per year (e.g. providing prescriptive capabilities around sepsis – which is the most expensive condition treated in hospitals, accounting for over $20 billion in annual costs to the U.S. healthcare system).


At Directions 2014, IDC’s Carrie MacGillivray talked about which industries were leading the charge with IoT based on the Value vs. Volume of the devices in their connected device networks. Insurance, Retail, and Transportation are considered the established IoT sectors, while Manufacturing, Consumer, and Utilities are the most promising in terms of growth.


Vernon Turner’s view is that Intelligent Systems (30%) is going to be the largest segment in the stack. Vernon also believes that companies like Bosch and GE will lead on Industrial IoT, while companies like Apple and Microsoft will lead on Human IoT.

Posted in Data, IoT.



IoT is the new Big Data?


My favorite writer, Gil Press, sums it up with, “It’s Official: The Internet Of Things Takes Over Big Data As The Most Hyped Technology” where he talks about how Gartner released its latest Hype Cycle for Emerging Technologies, and how big data has moved down the “trough of disillusionment,” replaced by the Internet of Things at the top of the hype cycle.

The term Internet of Things was coined by the British technologist Kevin Ashton in 1999, to describe a system where the Internet is connected to the physical world via ubiquitous sensors.

Today, the huge amounts of data we are producing and the advances in mobile technologies are bringing the idea of Internet connected devices into our homes and daily lives.

Definition of IoT Has Expanded

The Internet of Things was a popular concept dating as far back as a 2004 Scientific American article: RFID and sensor technology enabling computers to observe, identify and understand the world—without the limitations of human-entered data.

However, I think people took it beyond the capture of “physical” events/data. I think Kevin Ashton started from a network that was originally wholly dependent on human beings for information, and envisioned it expanding to involve anything that touched a person (physical or not), passing from machine to machine.

Capturing the behavior of people will require a much broader collection of data beyond just sensor technology…beyond the “physical” – whether that is web server clickstream data, e-commerce transaction data, customer service call logs, search logs, video surveillance,  documents, etc. There is much more than “physical” or sensor-only data that involves the customer.

To truly begin understanding the behavior of people, you need to capture data from every touch point, gaining a holistic view of that person. Once you gain a 360-degree view of your customers, or a 360-degree view of your business, by leveraging an environment of structured and unstructured data that can be analyzed, M2M (Machine to Machine) and/or IoT (Internet of Things) data involving physical devices becomes just a subset of the data sources available to such a project.

Is IoT a subset of Big Data or Vice Versa?

I was talking to David Parker, the head of Big Data & Analytics at SAP (a peer to CSC’s Big Data & Analytics), regarding IoT vs. Big Data. His management has established a new IoT business unit, which, I guaranteed David, would end up addressing the same business use cases as his Big Data team at the end of the day.


Last year Mukul Krishna, from Frost & Sullivan, presented a simple incremental view of how IoT feeds Big Data which then feeds a broader analytic platform. Think of IoT as a bunch of customized data sources (typically machines and sensors) leveraging customized collectors that feed a comprehensive platform (e.g. Hadoop vendors like Cloudera and Hortonworks) which, in turn, allow us to feed downstream analytic, BI, and visualization platforms.
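That incremental view can be sketched as a toy pipeline. This is hypothetical code; the function names stand in for real collectors, a Hadoop-style platform, and downstream analytics:

```python
def collect(readings):
    """Customized collector: normalize raw (device, value) readings."""
    return [{"device": device, "value": value} for device, value in readings]

def ingest(store, records):
    """Central platform: append normalized records to a shared store."""
    store.extend(records)
    return store

def analyze(store):
    """Downstream analytics: a simple aggregate over everything ingested."""
    return sum(record["value"] for record in store) / len(store)

store = []
ingest(store, collect([("thermostat-1", 21.5), ("thermostat-2", 22.5)]))
print(analyze(store))  # 22.0
```

The point of the layering is that the collectors are device-specific while everything downstream of the platform is device-agnostic.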

Are Sensors the Core of IoT?

A sensor is technically any device that converts one form of energy into another, the end result usually being an electrical form, mainly for measurement, control, or monitoring purposes.

Take a typical temperature sensor, like a gas-pressure-based tube sensor that expands or contracts to convert temperature into a mechanical motion, which can be displayed, recorded, or used for control as required. Translation….I just described the thermostat used in a refrigerator.

The raw electrical signal from a physical sensor is usually in analog form, and can be conveniently processed further and displayed on a meter or other suitable indication device, or recorded on paper or other media such as magnetic tape or more advanced digital systems, as required.

Sensors are typically classified by application, and there are many different types, each with its own inherent advantages and disadvantages for a particular use. Put simply, the sensor generates an output that can be conveniently displayed, recorded, or used to control or monitor the application at the point where the sensor is installed.

What’s so special about sensors? They translate the analog physical world into a digital computer world. We convert the sensor’s analog signals into digital signals so that a computer can read them, and then we feed those, along with other digital signals, into a Big Data platform.
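As a toy illustration of that conversion (all parameter names and scaling constants here are hypothetical, modeled on a generic linear analog temperature sensor read through a 10-bit ADC, not any specific part):

```python
def adc_to_celsius(raw, adc_bits=10, v_ref=3.3, offset_mv=500.0, mv_per_degree=10.0):
    """Convert a raw ADC count into degrees Celsius for a linear sensor."""
    # Step 1: map the digital count back to the analog voltage (in millivolts).
    millivolts = raw / (2 ** adc_bits - 1) * v_ref * 1000
    # Step 2: apply the sensor's (assumed) linear transfer function.
    return (millivolts - offset_mv) / mv_per_degree

reading = adc_to_celsius(232)  # a raw 10-bit count of 232 -> ~24.8 C
```

Everything downstream of a function like this is just digital data, which is exactly why IoT feeds so naturally into Big Data platforms.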

“Technologies that operate upon the physical world are a key component of the digital business opportunity,” as described by Gartner. Many of these “physical sensor technologies” may be new to IT, but they are expected to be high impact and even transformational.

I think IoT requires a lot of talent around the many types of physical sensors and how their signals are ultimately converted into a form that the emerging Big Data platforms can consume and analyze.

IoT needs a Big Data Platform

Getting your plants or your fridge to talk to you through sensors is one thing. Getting your plants to talk to your heating system is quite another. As we map the spread of IoT, it starts to get more complicated, and barriers appear: the lack of a centralized big data analytics platform is likely to halt progress.

Jeff Hagins, Founder and CTO of SmartThings, described the data platform he has been working on, which should help expand IoT and help product designers work out new ways of connecting machines and people.

He believes that the Internet of Things has to be built on a platform that is easy, intelligent and open. I argue that the evolving Big Data platforms introduced by the web-scale giants like Google, Yahoo!, LinkedIn, Twitter, Facebook and the like will become a standard for IoT-based applications….and IoT is just that, a set of specialized sensor connectors or sources coupled with a Big Data platform that enables a new generation of applications.

The blurring of the physical and virtual worlds is the strongest concept at this point. Physical assets become digitalized and become equal factors in understanding and managing the business value chain, alongside already-digital entities such as enterprise data warehouses, emerging big data systems, and next-generation applications.

What do you think?

A few companies in the IoT space:

A few interesting videos to watch

Posted in Data.



Avoid the Spiral


I ventured out of the “big company” environment back in 1998. Fifteen years later, I found myself back in the big company environment: a $13B revenue company, after my startup was acquired.

As part of an executive team of 18 at a publicly-traded company, the environment might be considered a lot different from any of the eight startups I was involved with before. However, the reality is that it is not.

A good company environment is made up of the same factors, regardless of company size.

Even though I find it challenging these days to make time to reflect on the changes I need to make to improve my own leadership….I know that if I don’t, I will fall into a trap…a spiral.

Creating a “good” company environment, or in my case, a good business unit environment, may not be that important when things are going well.

When things are going well, the staff is excited to be working at the company because:

  • Their career path is wide open with lots of interesting jobs that naturally open up.
  • You, your peers, your family, and even your friends all think you are lucky for choosing and being a part of such a success.
  • Your resume is getting stronger by working at a company during its boom period.
  • It’s most likely lucrative with variable compensation plans paying off, bonuses being given, equity growing in value.

However, when things are challenging…and your business is struggling…all those reasons become reasons to leave.

The only thing that keeps an employee at a company when things are challenging is that people actually like their job.

Having worked and led staff through the toughest times of company life and death – including things like working for free, working long days and nights, working on weekends – I know what can be asked of a team when times are tough. But no team is going to respond to your requests of sacrifice for long, if they are working in a bad company environment.

In bad company environments, good employees disappear. In highly competitive and quick to change technology companies, disappearing talent starts the spiral.

When your company’s most important assets leave (your top performers), the company struggles to hit its numbers; it tries to backfill its core talent but can’t recruit it fast enough; it misses its milestones; declines in value; loses more of its key employees.

Spirals are extremely difficult to reverse.

So, yes, creating a “good” company environment isn’t that important when things are going well….but it sure as hell IS important when things go wrong.

…and things always go wrong.

I personally come to work every day because of the people first…..then the adrenaline fix I get from the business sector I’m in….and, finally, the technologies and products we can produce to disrupt the market….in that order.

Staying away from the Spiral

In great organizations, people focus on their work and they have almost a tribal confidence that they can get their job done. Good things happen for both the company and them personally.

You come to work each day knowing that your work can make a difference for the organization as well as yourself….motivating you and fulfilling your needs to support the sacrifice – the long hours, the missed kid birthday parties, and the canceled date nights with your spouse.

In poor organizations, people spend much of their time fighting organizational boundaries and broken processes. They are not clear on what their jobs are, so there is no way to know if they are getting the job done properly or not.

In some cases, because of pure will, your star performers will work ridiculous hours and deliver on their promises…but they will have little idea what it actually means for the company or for their careers.

To make it worse, when your star performers voice how screwed up the company situation is, management still denies the problem, defends the status quo, and, frankly, ignores the fact that they are dealing with people…not just quarterly goals, revenue targets, and operating income…again, only something you see in a poor company environment.

So how does one create a “good company environment”? For me it’s as simple as breathing air….it comes down to “telling it like it is” with: 1) transparency, and 2) strong communication….and I don’t mean detailing quarterly results in internal WebEx all-hands.

I like to personally “go out on a limb” by exposing the truths….by being personally vulnerable….ultimately, leading the team in a way that establishes a level of trust. I do this with a healthy dose of transparency and communication.

And in order to get the level of trust I need, one can’t overemphasize the level to which you have to provide transparency and communication…..as a leader, you’ll feel uncomfortable when you are approaching the level that truly earns trust.

I have many techniques that I use to empower, not only my senior team, but my entire organization with the proper communication patterns only found in any good company environment….and needed to weather personal and professional storms.

Here are a few examples as I reflect over the past 15 years:

  • 1998: 50% marketshare erosion in one year due to changes in the customer ecosystem
  • 1999: A dysfunctional leadership team that can’t run the business
  • 2000: Dot com bust leading to a 2x increase in sales cycles
  • 2001: 9/11 requiring a 50% staff reduction
  • 2002: Your lead customer cancels their largest product line, on which you have bet the whole company
  • 2003: You enter into a patent war with your largest customer
  • 2004: Your leading acquisition transaction falls through with only months of capital left
  • 2005: Your co-founder and CTO loses his 1- and 3-year-old sons in a car accident
  • 2006: A disruptive player enters the market, fundamentally changing the landscape
  • 2007: Your services go offline, crippling your top ten customers and impacting their businesses significantly
  • 2008: An acquisition candidate does an “end-around” going after your key engineering talent
  • 2009: Your co-founder and CTO gets “cold feet” and can’t commit prior to securing your next critical round of financing
  • 2010: The board of directors pulls funding right after you hired the “A-Team” & begin to ramp sales
  • 2011: Your product launch is significantly delayed (12 months) due to a fatal flaw in the technology
  • 2012: You realize that your original product that was 3 years in the making has to be thrown away
  • 2013: Investors back out with only 4 weeks of cash flow left

During this time, my father had heart surgery, my mother was diagnosed with leukemia, my wife had postpartum depression, my oldest son was diagnosed with dyslexia, I was diagnosed with a blockage in my left coronary artery*, we had to sell our perfect home in order to keep the startup funds coming…and the personal list goes on.

My question for you:

Do you have the type of company environment where you have the support needed to weather any business or personal storm?

*False positive, thank god.

Posted in Leadership.



Leadership Means Sacrificing


What do Special Forces, Army Rangers, Navy SEALs, and Marines all have in common?

Teams like these go through what is considered by some to be the toughest military training in the world.

They also encounter obstacles that develop and test their stamina, leadership and ability to work as a team like no other.

I was talking recently to a colleague of mine about some of our own leadership at work. Emotions were strong. Deep sighs punctuated every other sentence.

We’re going through a business transformation, and as with most company turn-arounds, there is a strong conflict between the “old” and the “new”. This means old vs. new target markets, old vs. new business processes, old vs. new people…and, at the core of most issues, old vs. new culture.

This colleague is part of the “new team”, chartered to help create change.

“I struggle with some of the leadership,” he said, which reflected a general theme throughout the conversation.

This reminded me of the book, Fearless, the story of Adam Brown, a Navy SEAL, who sacrificed his life during the hunt for Osama Bin Laden.

Strange to think about the military when talking about business, since these two worlds couldn’t be further apart….or could they?

What Kind of Leadership Would You Prefer?

When Navy SEAL Adam Brown woke up on March 17, 2010, he didn’t know that he would die that night in the Hindu Kush Mountains of Afghanistan.

Who risks their lives for others so that they may survive? Heroes like Adam Brown do. You’ll find that military personnel are trained to risk their lives for others so that they may survive.

Would you want to be a part of a team with people who are willing to sacrifice themselves so that others like you may gain? Who wouldn’t?

In business, unfortunately, we give bonuses to employees who are willing to sacrifice others so that the business may gain.

I don’t know about you, but most people that I know want to work for an organization in which you have ABSOLUTE CONFIDENCE that others in the organization would sacrifice…so that YOU can gain…not them, not the business.

And guess what? The leadership and the business end up gaining in the end….because they have a workforce that doesn’t waste its time always looking over its shoulder, wondering what is going to happen next.

A Winning Culture

In my work to create high performing teams, I look for the type of business colleagues who are more like Adam Brown…the ones who sacrifice for the good of the team, not themselves. We want people who value this. This isn’t negotiable.

I want the team to know that I will GO OUT OF MY WAY to improve their well-being….that I care more about their success than my own. It’s not bullshit. Just ask anyone who has been part of a high-performing team….and you’ll probably hear the same.

“I care more about their success than my own”

Why? Because their success is our success. It’s that simple.

A winning culture is one where you have a team of people who are interested in improving each other…sacrificing their own interests in order to help the other.

In the end, you are NEVER looking over your shoulder…you are NEVER wasting energy trying to understand the mission. You’re focused, and you execute.

That’s a winning culture…a winning team…that’s leadership.

My colleague and I regained our enthusiasm as we reflected on our similar views. His last words are still echoing in my head…

“One team, one fight. Unity is what brings the necessary efficiencies to fight effectively. Lack of unity creates unnecessary distractions from the objective at hand.”

Feb 2015 Update: See Simon Sinek’s video on “Why good leaders make you feel safe.” Not directly related, but similar use of military to emphasize a point.

Posted in Leadership.



Big Data Top Ten


What do you get when you combine Big Data technologies….like Pig and Hive? A flying pig?

No, you get a “Logical Data Warehouse”.

My general prediction is that Cloudera and Hortonworks are both aggressively moving toward fulfilling a vision which looks a lot like Gartner’s “Logical Data Warehouse”….namely, “the next-generation data warehouse that improves agility, enables innovation and responds more efficiently to changing business requirements.”

In 2012, Infochimps (now CSC) leveraged its early use of stream processing, NoSQLs, and Hadoop to create a design pattern which combined real-time, ad-hoc, and batch analytics. This concept of combining the best-in-breed Big Data technologies will continue to advance across the industry until the entire legacy (and proprietary) data infrastructure stack will be replaced with a new (and open) one.
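The design pattern described above, serving real-time, ad-hoc, and batch analytics from one logical layer, can be sketched as a merge of a precomputed batch view with an in-memory stream delta. This is a minimal, hypothetical illustration (all names invented), not the Infochimps implementation:

```python
from collections import defaultdict

class LogicalView:
    """Toy 'logical data warehouse': answers ad-hoc queries by merging
    a precomputed batch view with deltas from events that arrived
    after the last batch run."""

    def __init__(self, batch_view):
        self.batch_view = dict(batch_view)   # e.g. nightly Hadoop job output
        self.realtime = defaultdict(int)     # stream-processed increments

    def ingest(self, event):
        # Real-time path: update the in-memory delta as events arrive.
        self.realtime[event["key"]] += event["count"]

    def query(self, key):
        # Ad-hoc path: one call combines batch and real-time results.
        return self.batch_view.get(key, 0) + self.realtime[key]

view = LogicalView({"clicks": 1000})   # last batch run counted 1000 clicks
view.ingest({"key": "clicks", "count": 5})
print(view.query("clicks"))  # 1005
```

The point of the pattern is that the consumer never has to know which tier answered the question.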

As this is happening, I predict that the following 10 Big Data events will occur in 2014.

1. Consolidation of NoSQLs begins

A few projects have strong commercialization companies backing them. These are companies that have reached “critical mass”, including DataStax with Cassandra, 10gen with MongoDB, and Couchbase with Couchbase Server. Leading open source projects like these will pull further and further away from the pack of 150+ other NoSQLs, which are either fighting for the same value propositions (with a lot less traction) or solving small niche use-cases (and markets).

2. The Hadoop Clone wars end

The industry will begin standardizing on two distributions. Everyone else will become less relevant (it’s Intel vs. AMD; let’s not forget the other x86 vendors like IBM, UMC, NEC, NexGen, National, Cyrix, IDT, Rise, and Transmeta). If you are a Hadoop vendor, you’re either the Intel or the AMD. Otherwise, you’d better be acquired or get out of the business by the end of 2014.

3. Open source business model is acknowledged by Wall Street

Because the open source, scale-out, commodity approach to Big Data is fundamental to the new breed of Big Data technologies, open source now becomes a clear antithesis of the proprietary, scale-up, our-hardware-only, take-it-or-leave-it solutions. Unfortunately, the promises of international expansion, improved traction from sales force expansion, new products and alliances, will all fall on deaf ears of Wall Street analysts. Time to short the platform RDBMS and Enterprise Data Warehouse stocks.

4. Big Data and Cloud really means private cloud

Many claimed that 2013 was the “year of Big Data in the Cloud”. However, what really happened is that the Global 2000 immediately began their bare metal projects under tight control. Now that those projects are underway, 2014 will exhibit the next phase of Big Data on virtualized platforms. Open source projects like Serengeti for VSphere; Savanna for OpenStack; Ironfan for AWS, OpenStack, and VMware combined, or venture-backed and proprietary solutions like Bluedata will enable virtualized Big Data private clouds.

5. 2014 starts the era of analytic applications

Enterprises become savvy to the new reference architecture of combined legacy and new generation IT data infrastructure. Now it’s time to develop a new generation of applications that take advantage of both to solve business problems. System Integrators will shift resources, hire data scientists, and guide enterprises in their development of data-driven applications. This, of course, realizes the concepts like the 360 degree view, Internet of things, and marketing to one.

6. Search-based business intelligence tools will become the norm with Big Data

Having a “Google-like” interface that allows users to explore structured and unstructured data with little formal training is where the new generation is going. Just look at Splunk for searching machine data. Imagine a marketer being able to simply “Google Search” for insights on their customers.
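At its simplest, search-based BI means ranking records, structured fields and free text alike, against plain keywords. A toy sketch of the idea (invented data, naive term matching rather than a real inverted index):

```python
def search(records, query):
    """Naive 'Google-like' search: rank records by how many query
    terms appear anywhere in their fields, structured or free text."""
    terms = query.lower().split()
    scored = []
    for rec in records:
        text = " ".join(str(v).lower() for v in rec.values())
        score = sum(term in text for term in terms)
        if score:
            scored.append((score, rec))
    # Highest-scoring records first.
    return [rec for score, rec in sorted(scored, key=lambda s: -s[0])]

customers = [
    {"name": "Acme", "segment": "retail", "notes": "churn risk, loves discounts"},
    {"name": "Globex", "segment": "energy", "notes": "expanding in Europe"},
]
print(search(customers, "retail churn"))  # Acme ranks first
```

Production systems (Splunk, Elasticsearch) do this with inverted indexes and relevance scoring, but the user-facing contract is the same: keywords in, ranked insights out.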

7. Real-time in-memory analytics, complex event processing, and ETL combine

The days of ETL in its pure form are numbered. It’s either ‘E’, then ‘L’, then ‘T’ with Hadoop, or it’s EAL (extract, apply analytics, and load) with new real-time stream-processing frameworks. Now that high-speed social data streams are the norm, so are processing frameworks that combine streaming data with micro-batch and batch data, performing complex processing on that data and feeding applications with sub-second response times.
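The EAL idea can be sketched in a few lines: pull events off a stream, apply analytics to each micro-batch, and load the enriched results, rather than transforming first and analyzing later. A minimal, hypothetical illustration (the anomaly rule and batch size are invented):

```python
def analytics(batch):
    # 'Apply analytics' step: enrich each event before it is loaded.
    return [{**event, "is_anomaly": event["value"] > 100} for event in batch]

def eal_pipeline(stream, loader, batch_size=3):
    """Extract -> Apply analytics -> Load, in micro-batches,
    instead of the classic extract-transform-load ordering."""
    batch = []
    for event in stream:                  # extract
        batch.append(event)
        if len(batch) >= batch_size:
            loader(analytics(batch))      # apply analytics, then load
            batch = []
    if batch:                             # flush the final partial batch
        loader(analytics(batch))

store = []
events = [{"value": v} for v in (10, 200, 30, 40)]
eal_pipeline(events, store.extend)
print(store)  # all four events loaded, already enriched
```

Frameworks like Storm or Spark Streaming play the role of the loop here; the key shift is that analytics runs in-flight, not as a downstream warehouse job.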

8. Prescriptive analytics become more mainstream

After descriptive and predictive, comes prescriptive. Prescriptive analytics automatically synthesizes big data, multiple disciplines of mathematical sciences and computational sciences, and business rules, to make predictions and then suggests decision options to take advantage of the predictions. We will begin seeing powerful use-cases of this in 2014. Business users want to be recommended specific courses of action and to be shown the likely outcome of each decision.
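The descriptive/predictive/prescriptive progression is easiest to see in code: a predictive model forecasts an outcome, and the prescriptive layer evaluates candidate actions against that forecast plus business rules, then recommends one. A toy pricing sketch (the demand model, prices, and cost are all invented):

```python
def predict_demand(price):
    # Hypothetical predictive model: demand falls as price rises.
    return max(0, 1000 - 8 * price)

def prescribe(candidate_prices, unit_cost=20):
    """Prescriptive step: score each candidate action using the
    prediction plus a business rule, then recommend the best one."""
    options = []
    for price in candidate_prices:
        demand = predict_demand(price)
        profit = (price - unit_cost) * demand
        if price >= unit_cost:           # business rule: never sell at a loss
            options.append((profit, price))
    best_profit, best_price = max(options)
    return best_price, best_profit

price, profit = prescribe([25, 50, 75, 100])
print(price, profit)  # recommends 75, with expected profit 22000
```

This is exactly the “show me the likely outcome of each decision” behavior business users are asking for, just at toy scale.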

9. MDM will provide the dimensions for big data facts

With Big Data, master data management will now cover both internal data that the organization has been managing over years (like customer, product and supplier data) as well as Big Data that is flowing into the organization from external sources (like social media, third party data, web-log data) and from internal data sources (such as unstructured content in documents and email). MDM will support polyglot persistence.
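In practice, “MDM provides the dimensions for big data facts” means joining raw external events to the curated master records so analytics can group by trusted attributes. A minimal sketch with invented data:

```python
# Master (dimension) data, curated under MDM governance:
customers = {
    "C1": {"name": "Acme", "region": "EMEA"},
    "C2": {"name": "Globex", "region": "APAC"},
}

# Big-data 'facts' flowing in from external sources:
events = [
    {"customer_id": "C1", "source": "twitter", "sentiment": 0.8},
    {"customer_id": "C2", "source": "weblog", "sentiment": -0.2},
]

def conform(events, dimension):
    """Join raw big-data facts to MDM-governed dimensions so that
    downstream analytics can group by trusted attributes like region."""
    return [{**e, **dimension.get(e["customer_id"], {})} for e in events]

for row in conform(events, customers):
    print(row["region"], row["source"], row["sentiment"])
```

Polyglot persistence just means the two sides of this join may live in different stores (a relational MDM hub, a NoSQL event store) while still conforming to one set of master keys.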

10. Security in Big Data won’t be a big issue

Peter Sondergaard, Gartner’s senior vice president of research, will say that when it comes to big data and security, “You should anticipate events and headlines that continuously raise public awareness and create fear.” I’m not dismissing the fact that with MORE data come more responsibilities, and perhaps liabilities, for those that harbor the data. However, in terms of the infrastructure security itself, I believe 2014 will end with a clear understanding of how to apply familiar best practices to your new Big Data platform, including trusted Kerberos, LDAP integration, Active Directory integration, encryption, and overall policy administration.

Posted in Data.



SAP & Big Data


SAP customers are confused about the positioning between SAP Sybase IQ and SAP Hana as it applies to data warehousing. Go figure, so is SAP. You want to learn about their data warehousing offering, and all you hear is “Hana this” and “Hana that”.

It reminds me of the time after I left Teradata when the BI appliances came on the scene. First Netezza, then Greenplum, then Vertica and Aster Data, then ParAccel. Everyone was confused about what the BI appliance was in relation to the EDW. Do I need an EDW, a BI appliance, an EDW + BI appliance?

With SAP, Sybase IQ is supposed to be the data warehouse and Hana is the BI or analytic appliance that sits off to its side. Ok. SAP has a few customers on Sybase IQ, but are they the larger well-known brands? Let’s face it….since its acquisition of Sybase in 2010, SAP has struggled with positioning it against incumbents like Teradata, IBM, and even Oracle.

SAP Roadmap


SAP’s move from exploiting its leadership position in enterprise ERP to exploring the new BI appliance and Big Data markets has been impressive, IMHO. Its acquisition of EDW and RDBMS company Sybase in 2010, following the earlier acquisition of BI leader Business Objects in 2007, was necessary to be relevant in the race to provide an end-to-end data infrastructure story. This was, however, a period of “catch-up” or “late entry” to the race.

The beginning of its true exploration began with SAP Hana and now strategic partnership with Hadoop commercialization company, Hortonworks. The ability to rise ahead of Data Warehouse and database management system leaders will require defining a new Gartner quadrant – the Big Data quadrant.

SAP Product Positioning

Let’s look back in time at SAP’s early positioning. We have the core ERP business, the new “business warehouse” business, and the soon-to-be-launched Hana business. The SAP data warehouse equation is essentially = Business Objects + Sybase IQ + Hana. Positioning Hana, as with most data warehouse vendors, is a struggle since it can be positioned as a data mart within larger footprints, or as THE EDW database altogether in smaller accounts. One would think that with proper guidelines, this positioning would be straightforward. But beyond database size and query complexity, there is the very challenging variable of customer organizational requirements and politics that plays into platform choice. You can tell that SAP struggled with simplifying its message for its sales teams early on.

SAP Hana – More than a BI Appliance

SAP released its first version of their in-memory platform, SAP HANA 1.0 SP02, to the market on June 21st 2011. It was (and is) based on an acquired technology from Transact In Memory, a company that had developed a memory-centric relational database positioned for “real-time acquisition and analysis of update-intensive stream workloads such as sensor data streams in manufacturing, intelligence and defense; market data streams in financial services; call detail record streams in Telco; and item-level RFID tracking.” Sound familiar to our Big Data use-cases today?

As with most BI appliances back then, customers spent about $150k for a basic 1TB configuration (SAP partnered with Dell) for the hardware only – add software and installation services and we were looking at $300K, minimally, as the entry point. SAP started off with either a BI appliance (HANA 1.0) or a BW Data Warehouse appliance (HANA 1.0 SP03). Both of these using the SAP IMDB Database Technology (SAP HANA Database) as their underlying RDBMS.

BI Appliances come with analytics, of course


When SAP first started marketing their Hana analytics, you were promised a suite of sophisticated analytics as part of their Predictive Analysis Library (PAL), which can be called directly in an “L wrapper” within an SQLScript. The inputs and outputs are all tables. PAL includes well-known predictive analysis algorithms across several data mining categories:

  • Cluster analysis (K-means)
  • Classification analysis (C4.5 Decision Tree, K-nearest Neighbor, Multiple Linear Regression, ABC Classification)
  • Association analysis (Apriori)
  • Time Series (Moving Average)
  • Other (Weighted Score Table Calculation)
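To make the first category in the list above concrete, here is a minimal K-means on one-dimensional data. This is purely illustrative: PAL’s K-means is invoked via SQLScript against tables, not via Python.

```python
def kmeans(points, k, iters=10):
    """Minimal 1-D K-means: repeatedly assign each point to its
    nearest centroid, then recompute centroids as cluster means."""
    centroids = points[:k]                     # naive initialization
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two obvious groups around 1.0 and 10.0:
print(kmeans([1.0, 1.1, 0.9, 10.0, 10.2, 9.8], k=2))  # ≈ [1.0, 10.0]
```

The other PAL categories (decision trees, Apriori, moving averages) follow the same table-in, table-out calling convention inside HANA.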

HANA’s main use case started with a focus around its installed base with a real-time in-memory data mart for analyzing data from SAP ERP systems. For example, profitability analysis (CO-PA) is one of the most commonly used capabilities within SAP ERP. The CO-PA Accelerator allows significantly faster processing of complex allocations and basically instantaneous ad hoc profitability queries. It belongs to accelerator-type usage scenarios in which SAP HANA becomes a secondary database for SAP products such as SAP ERP. This means SAP ERP data is replicated from SAP ERP into SAP HANA in real time for secondary storage.

BI Appliances are only as good as the application suite

Other use-cases for Hana include:

  • Profitability reporting and forecasting,
  • Retail merchandizing and supply-chain optimization,
  • Security and fraud detection,
  • Energy use monitoring and optimization, and,
  • Telecommunications network monitoring and optimization.

Applications developed on the platform include:

  • SAP COPA Accelerator
  • SAP Smart Meter Analytics
  • SAP Business Objects Strategic Workforce Planning
  • SAP SCM Sales and Operations Planning
  • SAP SCM Demand Signal Management

Most opportunities were initially “accelerators” with its in-memory performance improvements.

Aggregate real-time data sources

There are two main mechanisms that HANA supports for near-real-time data loads. First is the Sybase Replication Server (SRS), which works with SAP or non-SAP source systems running on Microsoft, IBM or Oracle databases. This was expected to be the most common mechanism for SAP data sources. There used to be some license challenges around replicating data out of Microsoft and Oracle databases, depending on how you license the database layer of SAP. I’ve been out of touch on whether these have been fully addressed.

SAP has a second choice of replication mechanism called System Landscape Transformation (SLT). SLT is also near-real-time and works from a trigger from within the SAP Business Suite products. This is both database-independent and pretty clever, because it allows for application-layer transformations and therefore greater flexibility than the SRS model. Note that SLT may only work with SAP source systems.

High-performance in-memory performance

HANA stores information in electronic memory, which is 50x faster (depending on how you calculate) than disk. HANA stores a copy on magnetic disk, in case of power failure or the like. In addition, most SAP systems have the database on one system and a calculation engine on another, and they pass information between them. With HANA, this all happens within the same machine.

 Why Hadoop?

SAP HANA is not a platform for loading, processing, and analyzing huge volumes – petabytes or more – of unstructured data, commonly referred to as big data. Therefore, HANA is not suited for social networking and social media data analytics. For such use cases, enterprises are better off looking to open-source big-data approaches such as Apache Hadoop, or even MPP-based next generation data warehousing appliances like Pivotal Greenplum or similar.

SAP’s partnership with Hortonworks enables the ability to migrate data between HANA and Hadoop platforms. The basic idea is to treat Hadoop systems as an inexpensive repository of tier 2 and tier 3 data that can be, in turn, processed and analyzed at high speeds on the HANA platform. This is a typical design pattern between Hadoop and any BI appliance (SMP or MPP).
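The tier-2/tier-3 pattern described above can be sketched as a two-tier store: hot data lives in a small fast tier, cold data is demoted to a cheap bulk tier and promoted back on access. All names here are invented; real deployments use mechanisms like replication jobs or Sqoop transfers rather than this toy cache:

```python
class TieredStore:
    """Sketch of the Hadoop/BI-appliance tiering pattern: hot data in a
    fast in-memory tier, cold data demoted to a cheap bulk tier."""

    def __init__(self, hot_capacity=2):
        self.hot = {}            # stand-in for the in-memory tier (HANA)
        self.cold = {}           # stand-in for the bulk tier (Hadoop)
        self.hot_capacity = hot_capacity

    def put(self, key, value):
        self.hot[key] = value
        if len(self.hot) > self.hot_capacity:
            # Demote the oldest hot key to the cold tier.
            oldest = next(iter(self.hot))
            self.cold[oldest] = self.hot.pop(oldest)

    def get(self, key):
        if key in self.hot:
            return self.hot[key]
        # Miss in the fast tier: promote from cold storage, then serve.
        value = self.cold.pop(key)
        self.put(key, value)
        return value

store = TieredStore()
for k in ("q1", "q2", "q3"):
    store.put(k, f"{k}-report")
print(sorted(store.cold))   # ['q1']  -- oldest demoted to the bulk tier
print(store.get("q1"))      # promoted back into the fast tier on access
```

The economics are the whole point: keep only what must be fast in the expensive tier, and let the bulk tier hold everything else until it is needed.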


SAP “Big Data White Space”?

Where do SAP customers need support? Where is the “Big Data white space”? SAP seems to think that persuading customers to run core ERP applications on HANA is all that matters. Are customers responding? Answer: not really.

Customers are saying they’re not planning to use it, with most of them citing high costs and a lack of clear benefit (aka use-case) behind their decision. Even analysts are advising against it: Forrester Research said the HANA strategy is “understandable but not appealing”.

“If it’s about speeding up reporting of what’s just happened, I’ve got you, that’s all cool, but it’s not helping me process more widgets faster,” said one SAP customer.

SAP is betting its future on HANA + SaaS. However, what is working in SAP’s favor for the moment is the high level of commitment among existing (European) customers to on-premise software.

This is where the “white space” comes in. Bundling a core suite of well-designed business discovery services around the SAP solution-set will allow customers to feel like they are being listened to first, and sold technology second.

Understanding how to increase REVENUE with new greenfield applications around unstructured data that leverages the structured data from ERP systems can be a powerful opportunity. This means architecting a balance of historic “what happened”, real-time “what is currently happening”, and a combined “what will happen IF” all together into a single data symphony. Hana can be leveraged for more ad-hoc analytics on the combined historic and real-time data for business analysts to explore, rather than just be a report accelerator.

This will require:

  • Sophisticated business consulting services: to support uncovering the true revenue upside
  • Advanced data science services: to support building a new suite of algorithms on a combined real-time and historic analytics framework
  • Platform architecture services: to support the combination of open source ecosystem technologies with SAP legacy infrastructure

This isn’t rocket science. It just takes a focused tactical execution, leading with business cases first. The SAP-enabled Big Data system can then be further optimized with cloud delivery as a cost reducer and time-to-value enhancer, along with a further focus around application development. Therefore, other white space includes:

  • Cloud delivery
  • Big Data application development

SAP must keep its traditional customers and SI partners (like CSC) engaged with “add-ons” to its core business applications with incentives for investing in HANA, while at the same time evolving its offerings for line of business buyers.

Some think that SAP can change the game by reaching/selling to marketers with new analytics offerings (e.g. see SAP & KXEN), enhanced mobile capabilities, ecosystem of start-ups, and a potential to incorporate its social/collaboration and e-commerce capabilities into one integrated offering for digital marketers and merchandisers.

Is there a path to define a stronger CRM vision for marketers? SAP won’t be able to do so without credible SI partners who have experience with new media, digital agencies, and specialty service providers who are defining the next wave of content- and data-driven campaigns and customer experiences.

Do you agree?

Posted in Data.



Infochimps, a CSC Company = Big Data Made Better


What’s a $15B powerhouse in information technology (IT) and professional services doing with an open source based Big Data startup?

It starts with “Generation-OS”. We’re not talking about Gen-Y or Gen-Z. We’re talking Generation ‘Open Source’.

Massive disruption is occurring in information technology as businesses are building upon and around recent advances in analytics, cloud computing and storage, and an omni-channel experience across all connected devices. However, traditional paradigms in software development are not supporting the accelerating rate of change in mobile, web, and social experiences. This is where open source is fueling the most disruptive period in information technology since the move from the mainframe to client-server – Generation Open Source.

Infochimps = Open Standards based Big Data

Infochimps delivers Big Data systems with unprecedented speed, scale and flexibility to enterprise companies. (And when we say “enterprise companies,” we mean the Global 2000 – a market in which CSC has proven their success.) By joining forces with CSC, we together will deliver one of the most powerful analytic platforms to the enterprise in an unprecedented amount of time.

At the core of Infochimps’ DNA is our unique, open source-based Big Data and cloud expertise. Infochimps was founded by data scientists, cloud computing, and open source experts, who have built three critical analytic services required by virtually all next-generation enterprise applications: real-time data processing and analytics, batch analytics, and ad hoc analytics – all for actionable insights, and all powered by open-standards.

CSC = IT Delivery and Professional Services

When CSC begins to insert the Infochimps DNA into its global staff of 90,000 employees, focused on bringing Big Data to a broad enterprise customer base, powerful things are bound to happen. Infochimps Inc., with offices in both Austin, TX and Silicon Valley, becomes a wholly-owned subsidiary, reporting into CSC’s Big Data and Analytics business unit.

The Infochimps’ Big Data team and culture will remain intact, as CSC leverages our bold, nimble approach as a force multiplier in driving new client experiences and thought leadership. Infochimps will remain under its existing leadership, with a focus on continuous and collaborative innovation across CSC offerings.

I regularly coach F2K executives on the important topic of “splicing Big Data DNA” into their organizations. We now have the opportunity to practice what we’ve been preaching, by splicing the Infochimps DNA into the CSC organization, acting as a change agent, and ultimately accelerating CSC’s development of its data services platform.

Infochimps + CSC = Big Data Made Better

I laugh many times when we’re knocking on the doors of Fortune 100 CEOs.

“There’s a ‘monkey company’ at the door.”

The Big Data industry seems to be built on animal-based brands like the Hadoop Elephant. So to keep running with the animal theme, I’ve been asking C-levels the following question when they inquire about how to create their own Big Data expertise internally:

“If you want to create a creature that can breathe underwater and fly, would it be more feasible to insert the genes for gills into a seagull, or splice the genes for wings into a herring?”

In other words, do you insert Big Data DNA into the business-savvy organization with simplified Big Data tools, or insert business DNA into your Big Data-savvy IT organization? In the case of CSC and Infochimps, I doubt that Mike Lawrie, CSC CEO, wants to be associated with either a seagull or a herring, but I do know he and his senior team are executing on a key strategy to become the thought leader in next-generation technology, starting with Big Data and cloud.

Regardless of your preference for animals (chimpanzees, elephants, birds, or fish), the CSC and Infochimps combination speaks very well to CSC’s strategy for future growth with Big Data, cloud, and open source. Infochimps can now leverage CSC’s enterprise client base, industrialized sales and marketing, solutions development and production resources to scale our value proposition in the marketplace.

“Infochimps, a CSC company, is at the door.”

Jim Kaskade
CEO
Infochimps, a CSC Company

Posted in Cloud, Data.




