cVidya Realizes Power of Hadoop

cVidya, the Israeli revenue assurance vendor, has announced their software will ‘support’ Hadoop, the open source technology for processing very large data sets across a distributed environment. You can read the press release here.

This is a very smart move by cVidya. cVidya’s core competence lies in skilfully presenting end users with conclusions drawn from data. Hadoop is a genuine Big Data technology which can greatly expand the volume of data which can be usefully accessed, whilst also lowering the cost of storing that data. By integrating with Hadoop, cVidya’s tools will sit atop a mature and powerful open source technology in order to deliver far superior results to their customers, without needing to make much change to their proprietary software. The press release states that cVidya’s software will be able to interface with a customer’s existing Hadoop solution. Alternatively, cVidya say they can provide customers with the hardware and software to implement Hadoop.

cVidya CEO Alon Aginsky is quoted as saying:

We see the distributed computing capabilities of Hadoop as a game changer for Fraud Management, Revenue Assurance and Marketing Analytics.

By making it cost-effective to analyze data from many different sources on a massive scale, we are essentially enabling digital and communications service providers to do more for less, thereby boosting profitability.

For once, talkRA can find no fault with the words of Aginsky. This development suggests cVidya’s management may finally be grappling with the strategic weaknesses that have seen them lagging competitors in recent years, after once claiming to have the leading market share for RA and FMS. In the past, cVidya CTO Gadi Solotorevsky advocated that business assurance departments should fight the rise of Big Data, taking a purist approach towards maintaining and controlling their own unique data repositories, even if this resulted in higher costs and duplication of effort within the business. By adopting Hadoop, it seems cVidya has realized there is a better way forward. Hadoop will see business assurance analysts working with the same data as used by the rest of the business. Those analysts will have much freer rein to tackle a broader scope of challenges. At the same time, cVidya’s software is more likely to attract the attention of telco staff working in other parts of the business.

So this new, more open approach is actually in cVidya’s best interests, despite their previous encouragement of empire-building within business assurance departments. Saying business assurance should be located in silos, surrounded by thick walls, was never going to be more than a tactic to increase sales, by providing those silos with data and technology that was separated from the rest of the business. It fell short of being a proper strategy, because it set endemic limits on what could be achieved. Those limits were not just felt by the departments that followed the advice; ultimately they were felt by cVidya, as limits on what they could sell.

Big Data inevitably tears down walls within organizations, and opens up former silos. It looks as though cVidya has also come to that realization, and they will seek to harness the potential on behalf of their customers, and for their own business.

Eric Priezkalns
Eric is the Editor of Commsrisk. Look here for more about the history of Commsrisk and the role played by Eric.

Eric is also the Chief Executive of the Risk & Assurance Group (RAG), a global association of professionals working in risk management and business assurance for communications providers.

Previously Eric was Director of Risk Management for Qatar Telecom and he has worked with Cable & Wireless, T‑Mobile, Sky, Worldcom and other telcos. He was lead author of Revenue Assurance: Expert Opinions for Communications Providers, published by CRC Press. He is a qualified chartered accountant, with degrees in information systems, and in mathematics and philosophy.

9 Comments on "cVidya Realizes Power of Hadoop"

  1. Daniel Peter | 3 Apr 2014 at 9:47 am |

    Alon Aginsky is right that distributed computing capabilities will be a game changer for RAFM, although it contradicts their earlier advocacy that business assurance departments should fight the rise of Big Data. It looks like there is a realization that it’s better to be part of the change than to fight against it.
    Technologies based on the Hadoop ecosystem are cost-effective, so telcos can process huge volumes of data at a lower cost. With this low-cost processing power, RA can also measure the data warehouse to detect leakage, if the cost case permits. Telcos are lowering their spending on RA, so it’s important that RA vendors also venture into analytics, so that other departments can leverage the investment in RA. I would say that most telcos will move towards Hadoop ecosystems to capture the cost advantage.
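The kind of leakage check described here, reconciling network usage against billed records to find events that were carried but never charged, can be sketched in miniature. The record layout, field names and values below are hypothetical; on a Hadoop cluster the same comparison would typically be expressed as a join over full CDR volumes rather than in-memory Python.

```python
# Minimal revenue assurance sketch: find CDRs that never reached billing.
# All field names and sample records are hypothetical illustrations.

def find_unbilled(cdrs, billed_ids):
    """Return the CDRs whose call_id is absent from the billing system."""
    return [c for c in cdrs if c["call_id"] not in billed_ids]

# Toy inputs standing in for switch CDRs and billed-record identifiers.
cdrs = [
    {"call_id": "c1", "duration_s": 60},
    {"call_id": "c2", "duration_s": 120},
    {"call_id": "c3", "duration_s": 30},
]
billed_ids = {"c1", "c3"}

print(find_unbilled(cdrs, billed_ids))  # the call(s) that leaked revenue
```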

  2. r agarwal | 3 Apr 2014 at 12:05 pm |

    Daniel – While you are right in theory, can you identify a few operators who have actually shifted to the so-called big data world? If you look at the technology, RDBMS still rules the roost, a fact not many of us are willing to admit. Massively parallel processing is something that can be done even by Oracle, though of course it would require a lot of tuning by the technical doctors.
    Undoubtedly the winds of change are blowing, but frankly, they are blowing more slowly than expected.
    I would put the question another way: to what extent is big data changing the traditional data analytics landscape? My answer is: just by a few basis points. To conclude, I would say that Hadoop is not the death of data warehousing.

  3. Jason Barrymore | 4 Apr 2014 at 3:10 pm |

    Eric –

    I guess cVidya has just woken up from a long slumber.
    Every single RA practitioner today speaks about Big Data and its associated benefits. Having used the Connectiva product in the past, and of course had it replaced for reasons best known to the RA world, I casually passed through their website, and they too have launched a new big data platform of sorts. Their new company, MARAISON or whatever it is called, seems to have made some impressive improvements to their products. What’s more, they have even struck a deal with Thuraya, the famous provider of satellite services. I must admit, that’s some move, considering they were silent for a long period.
    To avoid digressing, big data is the new world technology. By merely adopting Hadoop and other big data instruments, data scientists will be able to solve the puzzle and provide more meaningful reports.
    A week ago, I had a few guys from SAS visit our labs to demonstrate a sophisticated prediction model that runs on Bayesian principles. While we had shared chunks of data (of course modified), the propensity of this tool to generate results within a few hours was simply outstanding. They had used very minimal appliances, proving that the TCO isn’t as high as the market claims it is for SAS.
    My question to you, Eric, is: do you really think it is worth the cost of investing in a tool like an RA or FMS system, when SAS provides me with a far more enhanced product that runs on more analytical and scientific calculations?
    Not only does the opco get to use a world class tool, but it is also provided with world class data scientists who can stand by to provide support in the hour of a crisis.
    Even if you believe SAS may prove to be expensive, which of course I think otherwise, MS Excel is fitted these days with excellent analytical instruments.
    Do share your thoughts, Eric.

    Principal Scientist – O2 GmbH

    • @ Jason, thanks for sharing your insights. In answer to your specific question, I’m going to admit that I’m too ignorant to give a robust answer. This is a question about the real world – how much things cost vs. the benefits they deliver vs. the costs and benefits of using alternate tools. A good answer should be based on solid data, and I don’t have enough current data to form a reliable opinion. But I do take your point and I recognize the trend you’ve identified. Whether we’re talking about SAS at the high end of the market, or MS Excel at the low end, the increasing power of these technologies should make it easier to use them to service the relatively narrowly-defined requirements of revenue assurance and fraud management. That raises a serious question about whether it is better to invest in ‘best of breed’ tools designed for the specific needs of dedicated RA and Fraud teams, or whether telcos would be better off maximizing the value they obtain from technologies which are used more broadly across the business.

      I first talked about the ‘new paradigm’ for revenue assurance in 2008, and the crux of that presentation was that evolving technology and market forces would prompt some telcos to rethink how they approach the challenge now called ‘business assurance’. Instead of tasking dedicated teams to pursue specific recurring goals that change little, like the detection and recovery of certain kinds of revenue leakage, they would see advantages in employing internal consultants, with a higher skill set, to utilize more powerful tools with access to much more extensive data. To get the best value, and to justify the increased pay of these skilled internal consultants, they would be granted an open remit to analyse ways to improve the business. I think this fits very well with your conception of world class data scientists using SAS to perform analytical and scientific calculations. Of course there is a sliding scale between this world class vision and the classic conception of a ‘data integrity paradigm’ for revenue assurance, where staff are responsible for a limited number of reconciliations, and where the emphasis is on reproducing those reconciliations for large volumes of transaction data on a daily basis. To use a crude analogy, the difference between the paradigms is akin to the choice between employing a train driver to very efficiently execute pre-defined journeys according to a regular schedule, or to employ a driver of a car which can go off-road, as well as across highways, thus giving him freedom to explore territory as he sees fit.

      Whilst technology is increasing the opportunity to exploit data, the ‘new paradigm’ may not be appropriate for all telcos. The current organizational structure, the human resources available, and the corporate culture will also influence the approach that is taken in each telco. To continue the analogy above, whilst some telcos may be able to import off-road cars, they may prefer to employ train drivers who will stick to a timetable dictated by their bosses.

      I have an obvious bias, in that I want practitioners to develop career paths which see greater rewards in exchange for utilizing more advanced skills and covering a more extensive scope. But I have found that even best of breed software developers need not be opponents of my goals. The smarter vendors recognize that they have always been at risk of outside businesses taking an increased slice of the RA and FMS market. And they know they need to extend the range of their products to avoid stagnation. This is why firms like WeDo started talking about the vaguer and more extensive concept of ‘business assurance’ in preference to the well-defined subsets of revenue assurance and fraud management. Staff can also do more to push for more responsibility within their businesses, and vendors will respond to the demands made by their customers.

      I’ve given my thoughts, for what they’re worth. Now I wonder if I might persuade you to elaborate on your experiences as a data scientist, and with using SAS. Reading between the lines, it is possible you work in a culture which affords you some possibilities that other practitioners dare not ask from their employers. And yet, one of the best ways to facilitate change is to share examples of how work can be done differently, so others can learn from the example, and copy it. Would you be willing to write a guest blog?

  4. Daniel Peter | 7 Apr 2014 at 10:10 am |

    R Agarwal:
    We have to look at the Hadoop ecosystem as an evolution of the traditional RDBMS specialization. I’m not sure how much fine-tuning is required to do MPP in Oracle, but the Oracle experts I’m acquainted with are not very positive about it, and even if it can be achieved, whether it is worth the license investment is another question to be answered.
    You are right that the acceptance level of “Big Data” is lower than expected, but the change is happening. Telcos have started requesting the RA vendors to do PoCs, and you will see a lot of telcos adopting Big Data technology in the near future. There are obstacles to implementing big data, such as regulatory issues in certain countries that do not allow telcos to move data outside their premises or country, which becomes a hurdle for deploying the system in the cloud. Another important issue is that knowledge of NoSQL is low, so there is a big challenge in getting SQL experts to learn new query languages; but the smarter technologies provide drag-and-drop capability, so one does not have to become an expert in NoSQL to use Big Data technology.
    Real-time monitoring is now possible with Big Data, so telcos no longer have to wait until D-2 or D-1 to view KPIs, reports and dashboards, and instant fraud alerts also become possible.
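A minimal sketch of that kind of instant alerting, assuming a simple sliding-window rule: count a subscriber’s events inside a look-back window and flag when a threshold is breached. The window length, threshold and subscriber IDs are hypothetical; in production this pattern would run on a streaming layer over the big data platform rather than in a single Python process.

```python
from collections import deque

WINDOW_S = 60   # hypothetical look-back window, in seconds
THRESHOLD = 3   # hypothetical maximum calls allowed inside the window

def make_monitor():
    """Build a stateful per-subscriber sliding-window fraud monitor."""
    windows = {}  # subscriber -> deque of recent event timestamps
    def on_call(subscriber, ts):
        q = windows.setdefault(subscriber, deque())
        q.append(ts)
        # Evict timestamps that have fallen out of the window.
        while q and q[0] <= ts - WINDOW_S:
            q.popleft()
        return len(q) > THRESHOLD  # True => raise a fraud alert
    return on_call

on_call = make_monitor()
alerts = [on_call("sub-1", t) for t in (0, 10, 20, 30)]
print(alerts)  # the fourth call breaches the threshold
```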

  5. Eric,

    As a long time reader, first time contributor, I was waiting for your immediate response on the possibilities of SAS being used in the revenue assurance space. Not that I pretend to know their current capabilities but surely past association with your GRAPA nemesis, Rob Mattison, would have been enough to question what their offering might actually constitute.

    The links below show a history of Mattison and SAS working together over an extended period. I acknowledge there is nothing more current than what my limited Google research could uncover, so it would be interesting to understand if and why the relationship terminated. Does anyone know?

    2005 –

    2006 –

    2007 –

    2008 –

    • @ Harry H, it pains me to admit it, but Papa Rob Mattison wasn’t wrong about everything, and I shouldn’t suggest he is. My criticisms of the Grand Poobah began when he chose to set up GRAPA as a rival to the TMF’s RA working group. He could have joined the pre-existing TMF group, instead of just ignoring/copying any work that went before GRAPA, and instead of appointing himself Big Chief of a phoney ‘global association’. If he had joined the TMF group, it would have been more balanced, and would have benefitted from his involvement. Maybe, like me, Mattison would have been frustrated and then decided to leave, but Mattison didn’t even try to work with any of his peers. Crucially, Mattison’s pro-BI views would have been a counterweight to the excessive influence of cVidya and Gadi Solotorevsky, who adopted a ridiculous anti-BI stance for painfully obvious reasons. (Consider the few telcos that actually engage with the TMF RA group. How many of them are customers of cVidya? How many prefer to use BI or other generic tools? This explains how Solotorevsky has biased the work of the TMF.) Solotorevsky has effectively polluted the eTOM with a model for doing RA that makes it seem like RA can only be performed with a dedicated COTS system. Unfortunately, Mattison chose to compete with the TMF without first attempting to collaborate with it. And so, his criticism that the TMF is too dominated by software vendors, whilst valid, also reflects his own arrogance.

      And, oddly enough, if Mattison had joined the TMF team, it would have been easier for me to work with both of them. Both Mattison and Solotorevsky present themselves as widely-loved moderates who champion the ordinary RA employee. In reality, they’re both extremists. That’s why they each pretend that the other one doesn’t exist. They both want to dictate how the job should be done, setting limits on the way RA practitioners tackle problems. Put them in a room together, and I’d come across as a moderate, because I’m open-minded about the pros and cons of using generic data tools versus dedicated RA COTS software, and about all the other ways that RA practitioners can solve problems. I’m only immoderate in the sense of being critical of extremists.

      If I’m honest, I don’t know enough about using SAS to form an opinion about its usefulness for business assurance goals. If they thought they were being clever by getting Mattison to pitch their product, then I pity their foolishness – but there have been plenty of other chump businesses who thought Mattison would boost their sales. It turns out they were wrong, and that may explain why Papa Rob doesn’t get sales gigs like he used to, despite calling himself the Great Wizard of RA. But the foolishness of some marketeers doesn’t mean that SAS isn’t a decent product, or that it can’t be used for assurance. That’s why I’d like to hear more from people who actually use SAS to satisfy assurance objectives – people who really use it, as opposed to a biased blowhard like Papa Rob, or the tediously self-serving Solotorevsky.

  6. There is a lot of misconception about what big data actually means to the telco. The focus has been disproportionately on the Volume, Velocity and Variety of data, while the Value from data is still very fuzzy at the moment. This gets reflected in the lack of implementable use cases for big data.

    The biggest use case for Hadoop is as a low cost staging area to transform and load data; beyond that, widely accepted use cases become difficult to find.
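That staging pattern can be illustrated with a toy transform-and-aggregate step. The record layout is hypothetical, and plain Python stands in for the map and reduce phases that Hadoop would actually distribute across a cluster before the results are loaded into the warehouse.

```python
from itertools import groupby

# Hypothetical raw usage records landing in the staging area:
# account | service | seconds of usage
raw_lines = [
    "acct42|VOICE|180",
    "acct42|VOICE|60",
    "acct7|VOICE|300",
]

def map_phase(lines):
    """Transform each raw record into an (account, minutes) pair."""
    for line in lines:
        acct, _service, seconds = line.split("|")
        yield acct, int(seconds) / 60.0

def reduce_phase(pairs):
    """Aggregate minutes per account (sorting stands in for shuffle/sort)."""
    pairs = sorted(pairs)
    return {k: sum(v for _, v in grp)
            for k, grp in groupby(pairs, key=lambda p: p[0])}

print(reduce_phase(map_phase(raw_lines)))  # total minutes per account
```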

    The biggest issue I feel is the ability of widely available statistical packages, including SAS, to enable multiple users to simultaneously and interactively manage, explore and analyze data, build and compare models, and score massive amounts of data in Hadoop. While there has been massive interest and investment in this area, with vendors releasing solutions or adding functionality, I guess the capabilities are still not what we have with traditional analytics.

    Bringing the data out of Hadoop and into a traditional analytical sandbox simply defeats the advantages one would gain from a big data environment. At this point, the ability to port analytical logic to a Hadoop environment is the biggest challenge I see in telcos using Hadoop.

    • @Nikhil, that’s a great comment and I can see the truth in what you write about misconceptions. I admit that I’m sometimes fuzzy about the meaning of Big Data to telcos, and I’ve detected fuzziness in others too. The industry needs leaders to develop the concepts for how to extract value from Big Data, and then share their thoughts and examples with others. Any such leaders would be welcome to share their views via talkRA!

      I’m also encouraging the UK RAG to plan for a focussed session on using Big Data for the purpose of telecoms assurance. We’re not ready to hold that session yet – it will probably be run in the Autumn. But I’m keen to reach out to possible speakers (either attending our London meeting in person, or conferencing in) who work for telcos and can present case studies about really utilizing Big Data, or who work for tech suppliers and can talk about the technology and the opportunities to use it. In short, I can’t answer the question as to what value can be delivered via Big Data, but I’d love to hear from people who might know part of the answer.

Comments are closed.