Blog of Random Thoughts and Pictures

A User Centric Always Best Connected Service Business Model for MVNOs

October 15th, 2010

As announced on the Perimeter blog, on October 14th last Anwesh got to present our joint paper on “A User Centric Always Best Connected Service Business Model for MVNOs” during the Mobile Connectivity Platforms session of the Business Models for Mobile Platforms (BMMP 10) workshop.
We’ve taken a good hard look at the current state of the Mobile Virtual Network Operator (MVNO) business model, which is mainly based on reselling minutes cheaply, and really the long-term viability of the MVNO within the existing telecom industry structure seems unclear.
But we believe there is a big opportunity for MVNOs in the way they can offer IP-based communication services and we think the model has to change from Value Chain to Value Network.
In our view the vertically integrated world of the telecom industry will be unable to satisfy the new range of value added services (VAS), and we’re sure that more collaborative business models based on core competencies are likely to emerge. We think that next generation mobile services will leverage robust access platforms, with the emergence of a dedicated service composer.
Given this view we think MVNOs are in a better position to deliver innovative VAS as more tightly integrated partners with equipment vendors and through our paper we introduce a new MVNO as a Broker (MVNOB) model.
Take a closer look at the slides, as we provide an analysis of the characteristics of our conceptual MVNOB model.

The presentation raised some interesting comments and questions from the audience as it was seen to be disruptive to the present mobile communications business model. Most of the presentations at the conference and workshops were heavily in favour of mobile operators while we presented the opportunity for virtual operators in the changing telecommunications landscape.
Some of the questions/comments from the audience which we need to consider are:

  • Why is MVNOB not a threat to the MNO?
  • What will happen when MVNOBs start creating services themselves to satisfy a unique opportunity in the market?
  • Nokia-Siemens Networks (equipment vendors) are good candidates for MVNE.
  • Is this something theoretical, or is it actually happening in practice?
  • What is the technology enabler?
  • What about information flow, for example regarding missed calls or unavailability, among the various operators and the MVNOB?
  • How tightly or deeply will the MVNOB be integrated with the MNO?
  • Radio spectrum sensing issues – Broker platform can lead to power shift from the operators to broker and is highly disruptive to the present industry structure.

I must admit I’ll have to take a little time to address each one of these questions, and I wonder, do you have any other questions to add?
Update: The paper is now available on IEEE Xplore: “A user centric always best connected service business model for MVNOs”

FP6 has MORE than come to a close

September 17th, 2009

My first encounter with the FP6 IST programme was at its Irish launch event in Dublin Castle on July 12th 2002. Funnily enough I remember the day well, as there were unexpected roadworks in South Kilkenny that morning, so by the time I made it (late) to the conference room in the Castle it was packed with people and I got moved into one of those language translation booths, which was great: I had a higher viewing vantage point, a table and a very comfortable chair!
Well, seven years and eight FP6 projects later, the FP6 IST programme has come to a final chapter for me, as the IST MORE project is now technically complete.

Although in fairness my involvement in the research and development of IST MORE was peripheral; really Chris, Gemma, Chen, Kristian, Niall D., and a whole host of others helped bring the project from a grand vision for a “Network-centric Middleware for GrOup communication and Resource Sharing across Heterogeneous Embedded Systems” to a neatly designed software middleware that hides the complexity of the underlying heterogeneity of embedded systems and provides a MORE simplified API and management mechanism.

And to prove that it is neat, the MORE middleware helped integrate the management of a medical process in Hungary (for doctors and patients). It helped create a virtual organisation, allowing chronic patients to be monitored continuously by sensors and accessed via mobile devices. The middleware in turn allowed easy access to an on-line service for doctors, diabetes patients and patients’ families to react to emergency situations, but of most benefit, the implemented system was found to significantly decrease the number of necessary personal encounters between the doctor and the patient.
Watch the video below for further details

The project, just like any other framework programme project, has tons of deliverables, but the one I want to point you towards is the document D5.1 Test Protocol [pdf], which details the building blocks for the MORE testing framework, giving third-party developers a framework to test for correctness and compatibility. It also explicitly shows the test bed infrastructure for all the end user scenarios tested during the project, which is broken down into the laboratory environment test bed and the live field testing environment. The laboratory environment is complemented by an intensive test bed for performance evaluation, consisting of real world testing as well as simulation.
Finally, the project source is available for your viewing: either head over to the MORESS SourceForge page or just use svn directly

svn co mores

Digitising health records is it really going to be helpful?

September 6th, 2009

I hear again and again all the positives about eHealthcare; it seems to be the only way to go, which is why I’ve found this OA paper asking a very interesting question, “Do Electronic Health Records Help or Hinder Medical Education?”, and I wonder, in the same way, will electronic health records help or hinder (my) medical analysis in the future?
Photo Credit: JasonRogersFooDogGiraffeBee's photostream on flickr
It is becoming clear that hospitals are implementing, and in some ways are being forced to implement, massive electronic health record (EHR) systems, but in this implementation are they considering the end user … sorry, I should say the patient’s needs, wants and cares? And what about the people entering the data, from admin staff to nurses to doctors; are their needs being considered?
On similar massive projects it simply hasn’t been the case, and I do wonder!
So to the powers that be, please note the authors’ conclusion when it comes to the educational side of using the EHR:

that the mere presence of the EHR will not improve practice quality, and will not make education better or more efficient …
… if the EHR is used as a tool rather than an end unto itself, it will improve our education of young physicians as well as the care of our patients.

Interactive data visualisation

July 19th, 2009

While compiling my last entry on the map of science it did make me think that in this day and age data visualisation in the ICT world should be more interactive, like this visualisation of the Linux kernel.
What caught my eye was this work by Tony Hirst on visualising the lap time data from Australian F1 grand-prix in 2009 using ManyEyes, which has led onto some very interesting social commentary on the visualisation of UK MP’s expenses.
But in all these cases the visualisation research is an after-the-fact activity, with steady data sets and results.
Copyright of UC Regents 2009
But I wonder can macro architectural network patterns married with micro network component specifications and fused in a data visualisation tool, be a way to address future Internet design, pre-deployment?
And to make this happen, what exact “steady” data would I need to realise such a wonder?

Map of Science

July 12th, 2009

I’ve always had an interest in data visualisation; one of my most viewed blog entries is on a data visualisation of the relationships between different scientific disciplines, which is currently framed and hanging on my home office wall (the only one!). So this recently published map of a journal network that outlines the relationships between various scientific domains has me interested again.
This time the data visualisation is based on the collection

of nearly 1 billion user interactions recorded by the scholarly web portals of some of the most significant publishers, aggregators and institutional consortia. The resulting reference data set covers a significant part of world-wide use of scholarly web portals in 2006, and provides a balanced coverage of the humanities, social sciences, and natural sciences. A journal clickstream model, i.e. a first-order Markov chain, was extracted from the sequences of user interactions in the logs. The clickstream model was validated by comparing it to the Getty Research Institute’s Architecture and Art Thesaurus. The resulting model was visualized as a journal network that outlines the relationships between various scientific domains ….

and is fully recorded in a paper by Johan Bollen, Herbert Van de Sompel, Aric Hagberg, Luis Bettencourt, Ryan Chute, Marko A. Rodriguez, Lyudmila Balakireva, “Clickstream Data Yields High-Resolution Maps of Science”
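The “journal clickstream model” the authors describe is a first-order Markov chain, i.e. for each journal, the probability of the next journal a user visits. As a minimal sketch of the idea (the journal names and session logs below are invented for illustration, not the paper’s actual data), such a model could be estimated from click sequences like so:

```python
from collections import defaultdict, Counter

def clickstream_model(sessions):
    """Estimate first-order Markov transition probabilities from
    session logs, where each session is a list of journal visits."""
    counts = defaultdict(Counter)
    for session in sessions:
        # count each observed journal-to-journal transition
        for src, dst in zip(session, session[1:]):
            counts[src][dst] += 1
    # normalise counts into per-source probability distributions
    model = {}
    for src, dsts in counts.items():
        total = sum(dsts.values())
        model[src] = {dst: n / total for dst, n in dsts.items()}
    return model

# Toy sessions: each list is one user's sequence of journal visits
sessions = [
    ["Nature", "Science", "Cell"],
    ["Nature", "Cell"],
]
model = clickstream_model(sessions)
print(model["Nature"])  # {'Science': 0.5, 'Cell': 0.5}
```

The resulting transition probabilities are what get drawn as weighted edges in the journal network map.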
What results is a map that

represents the structure of scholarly activity from an observational perspective, not from a prescriptive or motivational one. User interactions with scholarly web portals are shaped by many constraints, including citation links, search engine results, and user interface features. In this paper we do not attempt to explain or motivate these interactions, but merely to demonstrate how their overall structure can be charted and described from clickstream maps of science.

Watch out, the image is large.
The PLoS site related to this paper has some interesting comments and the related article from the NY Times: Map of Knowledge offers some further insight from the authors.

The pencil so simple, so ubiquitous and so much history

June 28th, 2009

I recently finished an intriguing book on the history of the pencil … yes, the pencil.

Maybe it doesn’t sound too exciting, but really the story touches on the pencil as it emerged as a new writing technology, to the over-mining of plumbago and over-cutting of cedar trees, its base components.
The need for research, development and innovation in the creation of new writing lead, and the centuries of secrecy around that combination of graphite and clay mixture in the lead (Conté).
How the industrial revolution created a situation where there were 10 pencils for everyone on earth, the subsequent price fixing, international trade wars, standardisation (of lead grading), regulation and industrial consolidation.
To the threat of the mechanical pencil and ink pens, then typewriters, computers and many others; and yet, four centuries later, I look at my desk at work and see 5 pencils. I’ve no idea where they have come from, who made them or how, but I know why they are there; I’ll continue to use them for scribbling transient notes, and now at least I have a little more insight into the pencil’s history.

The greatest scientific impact from Ireland in the past 5 years is in…..

June 21st, 2009

…. Agricultural Sciences.
Photo Credit NZMonkey on Flickr
And that’s according to Thomson Reuters National Science Indicators, 1981-2007.
This is where Thomson Reuters have taken

Ireland’s world share of science and social-science papers over a recent five-year period, expressed as a percentage of papers in each of 21 fields in the Thomson Reuters database.

Ireland’s citation impact compared to the world average in each field is also highlighted, where

Ireland exceeded the world average by 15% (3.38 citations per paper for Ireland versus a world mark of 2.93 citations) [in Agricultural Sciences]. Ireland also scored well in relative impact in immunology (26% above the world mark), physics (23% above), materials science (+22%), and chemistry (+15%).

Looks like Ireland will have to pull its socks up when it comes to Computer Science and Mathematics, and as for Economics & Business, well, it looks like this report came too late!
But when I look at this topic of impact factors, citations and the h-index a little closer, things are not so clear cut, to the point of being fairly questionable.
Which has led me to this very interesting paper by Allen L, Jones C, Dolby K, Lynn D, Walport M (2009), “Looking for Landmarks: The Role of Expert Review and Bibliometric Analysis in Evaluating Scientific Publication Outputs”, PLoS ONE 4(6): e5910, where the authors were looking

To compare expert assessment with bibliometric indicators as tools to assess the quality and importance of scientific research papers.

And they found that

When attempting to assess the quality and importance of research papers, we found that sole reliance on bibliometric indicators would have led us to miss papers containing important results as judged by expert review. In particular, some papers that were highly rated by experts were not highly cited during the first three years after publication. Tools that link expert peer reviews of research paper quality and importance to more quantitative indicators, such as citation analysis would be valuable additions to the field of research assessment and evaluation.

After all of that I’m left wondering, have you got the h-Factor?
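For anyone wondering what their h-index actually measures: it is the largest h such that you have at least h papers with at least h citations each. A minimal sketch (the citation counts below are invented for illustration):

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h of the
    given papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4
print(h_index([25, 8, 5, 3, 3]))  # 3
```

Note how the second author, despite one far more cited paper, scores lower; that insensitivity to outliers is part of the h-index’s appeal, and part of what the Allen et al. paper suggests it misses.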

There are no communication research topics in J, Y or Z

June 6th, 2009

At the start of each year (usually in the January issue) I have a little look at the full subject index page for the previous year’s articles in the IEEE Communications magazine, just in case I missed an article I was interested in, and something pops out at me. In carrying out this task recently I noticed that for 2008 there were no communication research topics under the alphabet headings of J, Y or Z. So I checked the 2007 index, and the same again: no topics under J, Y or Z.
Shouldn’t there be a paper on “Jitterless yobibyte service bus for a ZSL” or “Zoning of jumbo frame networks: Yobibyting more than you can chew”. Hey I might trademark that last one!

Scientific bibliography reference management software

April 19th, 2009

A great question appeared on the TSSG mailing list the other day, a recommendation for reference management software.
Photo Credit: freelina2 on flickr
I’ve started to use Connotea, and with the handy bookmarklet option for my browser I have found it really easy to add papers, and there are tons of export options for storing the reference material elsewhere.

Can paying for research content multiple times be the right way to go about it?

March 22nd, 2009

Okay, it’s a drum beat gathering some pace on this blog, but as an update to my most recent thoughts on Open Access, there has been some considered input captured by Bora Zivkovic on that same paper, and on the fact that Open Access in the developing world – yes, it is a Good Thing.
Photo Credit Open Access (storefront) by Gideon Burton (Flickr)
I’ve also noticed that this very issue of paid research journal content vs open access is taking a very interesting twist in the US, and has some way to run its course, but if I was a betting man …
Finally this entry cannot pass without a glance at the Irish Innovation Monster
Photo Credit Cookie Monster! by Hayley_Bouchard (Flickr)