Blog of Random Thoughts and Pictures

Bridging MQTT brokers and using security certs from Let’s Encrypt

October 14th, 2019

This is an item that came up while working on a project within the TSSG and so might be worth sharing.

Have you ever tried to use an MQTT broker? Message Queuing Telemetry Transport (MQTT) is a machine-to-machine (M2M), Internet of Things data protocol, sitting alongside other data protocols such as XMPP, CoAP, AMQP, and WebSockets. Invented in 1999, MQTT is now an OASIS (Organization for the Advancement of Structured Information Standards) standard and an ISO standard (ISO/IEC PRF 20922).

MQTT is extensively used in Amazon Web Services, Microsoft Azure IoT Hub and IBM WebSphere MQ. It follows a publish/subscribe message exchange pattern, can support persistent message storage on the broker, and supports security in the form of authentication using a user name and password, and encryption using SSL/TLS.

Something like the Eclipse Mosquitto broker has a really small code footprint; libmosquitto (the client library) is about 1.3 MB. That makes it ideal if processor or memory resources are limited, and also ideal if bandwidth is low or the network is unreliable. Classic problems in the IoT space.

In the case of using MQTT for the smart grid, scale and security are top priorities. To achieve scale I’ve looked at bridging MQTT brokers in a hub and spoke model, where a very light MQTT broker is at the edge of the network (at the end of the spoke) and there’s a large MQTT broker at the hub which can aggregate all the data.

However, the purpose of this post is to highlight the security aspects within MQTT, and in particular the application of encryption (SSL/TLS) when using Let’s Encrypt certificates. Applying a certificate to an MQTT broker is not too hard; there’s a nice guide here on Mosquitto SSL Configuration for MQTT TLS Security, and here too on SSL/TLS Client Certs to Secure MQTT. However, in the vast majority of cases the examples use self-signed certs, not certs as provided by Let’s Encrypt.

By the way, if you don’t know, Let’s Encrypt is a non-profit certificate authority run by the Internet Security Research Group that provides X.509 certificates for Transport Layer Security encryption at no charge. Each certificate is valid for 90 days and the renewal process is quite simple.
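
Just to set the scene, a broker listener secured with a Let’s Encrypt cert ends up looking something like the snippet below in mosquitto.conf. Take it as a rough sketch of the guide-style setup rather than gospel; it assumes the usual Let’s Encrypt file names (cert.pem, chain.pem, privkey.pem) have been copied into /mosquitto/config/certs.

listener 8883
cafile /mosquitto/config/certs/chain.pem
certfile /mosquitto/config/certs/cert.pem
keyfile /mosquitto/config/certs/privkey.pem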

Now, bridging two MQTT brokers can be relatively straightforward too; however, getting the certs right when you want that bridge to be encrypted can be a little tricky. Just look at how much you have to do to bridge a Mosquitto MQTT Broker to AWS IoT.
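
For what it’s worth, on the spoke side the bridge itself is just another section in mosquitto.conf. The sketch below is purely illustrative: the connection name, hub address and topic pattern are placeholders I’ve made up, and the TLS line at the end is the bit the rest of this post is really about.

connection spoke-to-hub
address hub.example.com:8883
topic sensors/# out 1
bridge_capath /etc/ssl/certs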

In my case I wanted to bridge two Mosquitto MQTT Brokers, each with encryption enabled by a Let’s Encrypt cert. Firstly I created a special Docker container that could pick up the Let’s Encrypt cert, and, having followed all the guides, I kept getting the following error in the logs:

OpenSSL Error: error:14037418:SSL routines:ACCEPT_SR_KEY_EXCH:tlsv1 alert unknown ca

OpenSSL Error: error:140370E5:SSL routines:ACCEPT_SR_KEY_EXCH:ssl handshake failure

Socket error on client , disconnecting.

I tried verifying the certs by installing openssl and running

openssl verify cert.pem

But all was fine.
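
Two further checks can be useful at this point (broker.example.com below is just a placeholder for your own broker’s hostname): verifying the leaf cert against the chain file that Let’s Encrypt issues alongside it, and watching the TLS handshake from a client’s point of view.

openssl verify -CAfile chain.pem cert.pem

openssl s_client -connect broker.example.com:8883 -servername broker.example.com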

I thought I had to download the trusted root CA certificates for Let’s Encrypt and place them somewhere in the Alpine Linux system (the base OS of the broker), but I must admit this “somewhere” was not so clear to me.

The problem is that the MQTT broker does not know how to verify its own CA before starting the SSL exchange with any client. This is because the CA signing the Let’s Encrypt cert is not yet distributed and bundled by default into the Alpine Linux system, and therefore has to be added manually.

In the Mosquitto MQTT broker configuration, instead of just pointing directly at the chain.pem file, I decided to point at the default place where all CA certs should be.

#cafile /mosquitto/config/certs/chain.pem
capath /etc/ssl/certs

And this write-up on installing certificates in an Alpine Image to establish Secured Communication (SSL/TLS) really got to the heart of the matter: the cert needs to be copied to a special directory, /usr/local/share/ca-certificates/, and then you need to run the program update-ca-certificates so it gets placed in the right way into the folder /etc/ssl/certs.

After much head scratching, it all comes down to two commands:

cp /mosquitto/config/certs/chain.pem /usr/local/share/ca-certificates/chain.pem

update-ca-certificates

Once done (via a docker-entrypoint.sh command) the container is able to handle the CA issue, and bridging two Mosquitto MQTT brokers that are using Let’s Encrypt certificates can be achieved.
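
For completeness, here’s a minimal sketch of what that docker-entrypoint.sh could look like. It assumes the certs are mounted under /mosquitto/config/certs and that the image’s normal Mosquitto start command is passed through as arguments; your container layout may well differ.

#!/bin/sh
set -e

# Add the Let's Encrypt chain to the Alpine system trust store
cp /mosquitto/config/certs/chain.pem /usr/local/share/ca-certificates/chain.pem
update-ca-certificates

# Hand over to whatever the image would normally run (e.g. mosquitto -c /mosquitto/config/mosquitto.conf)
exec "$@"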

Presenting the ICT PROSE project at Open World Forum

November 1st, 2012

I had the pleasure of attending the Open World Forum recently, where I got to represent a new FP7 project on open source called ICT PROSE.

Open World Forum

OWF itself really opened up my eyes to the activities happening around Europe in regards to open source in the enterprise. An overview of what I got up to is on the TSSG review of OWF12, and the rest of this post is a cross post of what I’ve written with Roberto Galoppini on behalf of the ICT PROSE project.

==================

Can projects and organisations keep full control of their data in open source forges? This was one of the key questions asked during the recent Open Forges Summit, part of the Open World Forum 2012*, held in Paris. With Roberto Galoppini (Geeknet) as Track Chair, and Miguel Ponce de Leon (TSSG) presenting, the PROSE project had some insights to present on the matter.

As part of the summit Roberto introduced participants to the changed landscape for source forges.

From there Ross Gardler of the Apache Software Foundation highlighted how forges today don’t make it easy to discover the individuals and the communities behind the software, and he made some suggestions (around the humble honey bee) on how forges could improve the way users discover the important people and communities behind open source projects.

Scott Wilson of OSS Watch showed how it’s possible to bridge the gap between open source development processes and app stores, particularly in the case of mobile apps – but he pondered the question of how this could be applied to other kinds of software.

Stijn Goedertier of the ADMS Working Group outlined the future plans for the ADMS.SW metadata vocabulary, which is used by JoinUp to describe open source software in the forge, making it possible to more easily explore, find, and link open source software on the web.

Olivier Berger & Christian Bayle of FusionForge gave an integrated presentation on the advances in interoperability of FLOSS forges from the COCLICO project.

Miguel then shared the goals of the ICT PROSE project. Through the presentation “Empowering FLOSS in European Projects”, Miguel informed the audience about the PROSE project, whose objective is to accelerate the adoption of open source software in EU ICT projects. The presentation highlighted the project’s plans to increase the lifetime of the software developed inside European projects and thus maximize projects’ impacts. It covered the creation and management of a platform for FLOSS project management, the development of a training program on legal and business aspects pertaining to FLOSS adoption, and a dissemination program to promote the adoption of a FLOSS-driven model in EU ICT projects.

Finally, the summit concluded with Laurent Charles, on behalf of Enalean, highlighting how they achieved faster innovation with the Tuleap forge, and how customers quickly understood the gains: more contributions, exchanges, and quality developments that really match their needs while staying free and independent.

Clearly there are new opportunities on how to allow projects to keep full control over their data in open source forges and new initiatives that the EU is driving have started to address the issues.

The Open World Forum is the leading global summit bringing together decision-makers, communities and developers to cross-fertilize open technological, economic and social initiatives, in order to build the digital future. The event was founded in 2008 and now takes place every year in Paris, with over 180 speakers from 40 countries and an international audience of 1,900 delegates in 2011.

Irish Consultation on the next EU research funding programme

March 30th, 2011

This has been cross posted to my TSSG blog.

This was a short half-day workshop I was recently invited to participate in. The Framework Programmes (FP7 and everything before) have, to date, been the European Union’s chief instrument for funding research. Preparations for the next programme for 2014-2020 (now called Horizon 2020) and the new Common Strategic Framework for Research and Innovation are now underway, and the priorities are being discussed at national and European levels. Ireland has the opportunity to influence the direction and balance of European research, and so the purpose of this workshop was to provide input to Ireland’s national submission to the European Commission in response to its Green Paper “From Challenges to Opportunities: Towards a Common Strategic Framework for EU Research and Innovation Funding” [pdf].

This Europe-wide consultation began in February 2011, with the purpose of collecting opinion on the future of research and innovation funding and co-operation in Europe into the next decade.

The paper itself asks (27) questions about how future funding systems might improve on previous ones, whether new mechanisms are needed and how the elements of the funding system should be balanced, which would have a direct effect on the funding allocated to certain schemes.

This Irish national consultation was led by the Advisory Council for Science, Technology and Innovation (ACSTI) and the drive of the workshop was to refine and develop the views gathered so far from the research and innovation communities in Ireland on the questions put in the European Commission’s Green Paper.

The workshop was opened by the chair, Professor Anita Maguire, whereupon the purpose, structure, key themes and issues for discussion were explained:

  • Benefits of being in the Framework Programmes;
  • Making research and innovation funding more attractive and easy to access for participants;
  • Public-private partnerships;
  • Training and exchange schemes;
  • How to best cover the whole innovation cycle;
  • How to strengthen industry participation;

Once this overview was given, the room was split into groups and each group was given the task of commenting on a sub-set of the questions; I was in the group for questions 1-7. While I thought the majority of the responses were fine, I was a little concerned with the responses to questions 1 and 2, and it seems the group I was in thought so too.

We offered feedback, and in some small way I’m glad to see it was considered, as the process is now complete and the final Irish submission to the green paper can be read off this link [pdf], with changes to questions 1 and 2 afoot.

In fact there were 13 responses from Ireland: Chemical and Physical Sciences Committee of the Royal Irish Academy, Electricity Supply Board, Forfas, Health Research Board, Irish Research Staff Association, Irish Universities Association, Marine Institute, National Committee for Geographical Sciences, Royal Irish Academy, Science Foundation Ireland, University College Dublin, and one from our very own Jim Clarke of Waterford Institute of Technology, for which I also offered some input.

While the process can seem long winded, in fact all the opportunities are there to have your say in the programme; you just have to take the time to source those opportunities well in advance.

15th Meeting of the COST Domain Committee for ICT

March 4th, 2011

This has been cross posted to my TSSG blog

Some early morning fog in Brussels had me held up a little on arrival, but once we touched down it turned out to be a glorious spring day in Belgium. I was here in Brussels for the COST ICT domain committee meeting and hearings, where we listened to and decided on some new COST actions in ICT.

We were also there to discuss the COST ICT Domain budget update, the full proposal selection process and the proposal ranking algorithm, and we had an overview of the outlier tool.

There were also some changes in final event handling (as of Nov. 2010) and some e-COST updates that needed to be discussed and of course the monitoring of ICT Actions in progress.

We went through the evaluation of completed and ending actions, which included:

2010 Completed Actions

With the actions ended, the rapporteurs gave an overview of some of the final news items from the Actions.


Action 2100 was highlighting its joint workshop on Wireless Communications, 1 – 2 March 2011, Paris, France. JNCW 2011 was organised jointly by the European Network of Excellence in Wireless Communications (NEWCOM++) and the European Cooperation Action on Pervasive Mobile and Ambient Wireless Communications (COST 2100).


Action 2101 was highlighting its Biometric ID Management Workshop (BioID 2011), the third such international workshop organised by COST Action 2101. BioID 2011 will be held at the Brandenburg University of Applied Sciences in the city of Brandenburg.


Action 2102 was highlighting a publication announcement for the Proceedings of the PINK COST 2102 International Conference on “Analysis of Verbal and Non Verbal Communication and Enactment: The Processing Issues”, in LNCS, and also its Third International Training School notes from March 15-19, 2010.

7th concertation meeting of Future Networks

February 11th, 2011

This entry is cross posted from my TSSG blog.

Okay, time is not being kind to me, especially when it comes to completing entries for this blog, and February 2011 already feels a lifetime away. But given that I was in Brussels directly after the FIRE workshop, I’d like to report on my attendance at the 7th concertation meeting of Future Networks.

The main part of the plenary was given over to a description of Future Networks research moving towards standardisation activities. The last part of the session was given over to future research topics in the area, as identified by Net!Works, ISI, EIFFEL, NEWCOM++, BINE and EURO-NF. All presentations can be seen off this link.

The second day of this meeting was split into a number of separate plenaries as the Network of the Future projects are organised into three clusters: Future Internet Technologies (FI Cluster), Radio Access and Spectrum (RAS Cluster) and Converged and Optical Networks (CaON Cluster). I attended the FI Cluster, the agenda and presentations of which you can see off of this link.

There were a number of presentations on the economic and user perspective of Inter-ISP traffic optimization, where ETICS, IBBT, SESERV and SMOOTH-IT presented on the matter.

I was quite interested in the session on Information and Execution Automation between the Service and Network planes, where GEYSERS, MEDIEVAL, ONE, ONEFIT and UNIVERSELF gave their viewpoints; however, I was left a little perplexed that there was no real consensus on the topic and no plan to reach one.

Okay, it’s only a few words and it really shouldn’t have taken me this long to post, but I hope this gives you a quick overview of the EU activities in the area of the Future Internet, with the next big event being FIA Budapest in May.

SFI Future Internet workshop

February 8th, 2011

This entry is cross posted from my TSSG blog.

I have found it hard to keep a handle on all the Internet-based research happening in Ireland, so I jumped at the chance to participate in the recent SFI workshop on the Future Internet. It proved to be a fantastic opportunity to catch up with old acquaintances and to meet some new researchers in the field.

Now the topic line is a little controversial in that the term Future Internet now means many things to many people; however, what’s good about the term is that it can act as a nice umbrella to capture the massive shift in Internet research, which is looking for new ways to move, share, find, define and create digital information. Whether this information is for use in Education, Health, Finance, Marine or even Agricultural services, it was great to see the wealth of situations to which Irish research was being applied.

To set the context for the day, early presentations in the workshop highlighted the meaning of the Future Internet, the new architectures being discussed at the EU level and some perspectives on the European Future Internet Assembly.

I gave an overview presentation on this topic, and my slides can be seen here.

Then, in earnest, a volley of 10-minute presentations was given by:

  • Willie Donnelly on “Why the Future Internet?”
  • Stefan Decker on “From Linked Data to Networked Knowledge” and “Real-World Internet (FIA)”.
  • John Kennedy on the “Future Internet – An Intel Perspective”.
  • Pol Mac Aonghusa on “IBM Smart Cities”.
  • John Holland on the “Ericsson view”.
  • Keith Griffin on the “Cisco view”.
  • Fergal Ward on the “Intune view”.
  • Barry Smyth on “The Sensor Web”.
  • Ronan Farrell on the “CTVR Future Internet activities”.
  • Mike Hinchey on “Lero and FI”.
  • Barry O’Sullivan on “4C Future Internet activities”.
  • Steve Gotz on “CNGL and FI”.
  • Padraig Cunningham on “Clique and FI”.
  • Martin Johnsson on “FAME and FI”.
  • David Malone on the “Hamilton Institute Future Internet activities”.
  • Brendan Jennings on “FI Dagstuhl 2011”.

All the slides can be picked up off this SFI FI workshop page.

I found the format perfect: 10 minutes meant people had to get to the point quickly while at the same time giving an impression of the depth of their research, and it has to be said some very interesting solutions are being investigated in Irish research organisations.

Next up was a presentation by the Marine Institute with some use cases to which only Future Internet technologies could be applied. What followed was some discussion on other applicable use cases, and I gave a presentation on an upcoming FI PPP use case.

I usually find that the next steps for this type of event are left hanging in the wind; however, in this case it couldn’t be further from the truth, as plans are now afoot for a broader workshop to take place in Q2 of 2011.

EU – Japan Symposium on Future Internet and New Generation Networks

October 22nd, 2010

This entry is cross posted to my TSSG blog.

Directly from finishing my open source session at the 6th Future Networks concertation meeting I headed for the Brussels airport to catch a flight to Tampere, Finland (via Stockholm) for the 3rd EU-Japan Symposium on Future Internet and New Generation Networks.
The flight was easy going, and the stop off in Stockholm was nice as I got to watch some Champions League football and then relax a little in the Starbucks cafe, catching up on some emails.

Tampere is the third largest city in Finland, and the scene of a number of technological innovations; I was told the first test GSM calls were made here. The actual hotel/conference location was set in a picturesque spot by a lake.

Tampere outskirts

The event itself started with some high-level presentations on EU Digital Policy, the Digital Agenda for Europe and the ICT Paradigm Shift in this decade. I found the presentation by Masahiko Tominaga, Vice President, NICT, on NwGN R&D Strategy [pdf] the most interesting of these.

At the next break, it was great to get the opportunity to share lunch with Sasi. Now I know Sasi normally only sits a couple of floors away from me, but it’s at times like this that we really get a chance to discuss a whole myriad of topics at length.

After lunch the event was broken up into separate tracks and I headed for the Internet/Network Architectures session. Sasi presented on the emerging generation of symbiotic networks: Federated Communication Systems [pdf], while I took the opportunity to present on RINA, the Recursive Inter Network Architecture, which is based on the work originated by John Day.

What I took from the whole session was the interesting work of Takeshi Usui (NICT/KDDI Laboratories) on the Virtual Network Mobility: Advanced Mobility Management over Network Virtualization [pdf], and Nao Kawanishi (ATR) on his vision of An Open Mobile Communication System with All Strata Virtualization [pdf].

I was pleasantly surprised by the symposium and the people I met at this event, and by the first sign of snow, which made the long trip back, by bus to Helsinki and then by plane via London Heathrow and on to Dublin, a worthwhile one.

Protection and Trust in Financial Infrastructures

September 24th, 2010


Not one of our first projects to start in the FP7 programme, but our first project to finish. PARSIFAL was a coordination action, funded by the European Research Programme for Critical Infrastructure Protection. Its objective was to define how to better protect Critical Financial Infrastructures (CFI) in Europe.
There was a limited set of partners on the project: ATOS Origin Sae, Spain (coordinators); ACRIS GmbH, Switzerland; @bc – Arendt Business Consulting, Germany; Avoco Secure Ltd, UK; EDGE International BV, Netherlands; and of course ourselves from the TSSG.
The key achievement of the project was to strengthen engagement between the European Commission and the Financial Services Industry in terms of trust, security and dependability. Financial Services are seen as critical ICT infrastructures, and so the purpose of this project was to provide direction for future research programmes, helping to align research in this area to the needs of the Financial Services Industry.
Parsifal has produced a whitepaper to highlight its achievements [pdf].
There is also a document which gives some further details of the main research gaps in the area, such as the classification of identity attributes for on-line and mobile users of financial services. The document points out that these identity attributes should be defined and well understood by providers of these services and their customers, and in particular:
3.1 Classification of identity attributes for online and mobile users
3.2 Trust Indicators for financial services to determine risk level
3.3 Multiple-identity management platforms
The new dimension of cloud computing/architectural changes and de-perimeterization can lead to new needs for standardization and regulations (flexible virtual concentration):
4.1 Standard and cross border digital identities in the financial market
4.2 Data-linked security policies
4.3 De-perimeterization of organizations: models and cross-border issues
5.1 Design and implementation of secure platforms and applications
5.2 Model Definition
For the full document, read Section 3.1 of the Gap analysis report by clicking here.
One of the main research items from the project has been the draft ontology of financial risks & dependencies within and without the Financial Sector (D2.1 – V2.0) [pdf].
The aim of the document is to contribute to a common understanding of the key concepts in risk management and financial infrastructures. It presents a simple model combining the ontologies from both the security and the financial sector.
There are ontologies in three work areas (business continuity, control engineering, trusted sharing of sensitive /confidential information). These ontologies lay the ground for further approaches, while one-page roadmaps illustrate the instant benefits of this approach.
A Simple Ontology of Digital Identity
There is an extensive structured glossary in the document too. This glossary is based on a compilation of terms available from public institutions (like the European Central Bank) or known experts. It also includes terms appearing in the other deliverables of the Parsifal project that are especially relevant to our context.
The main contributors to this work were J.-Yves Gresser, B. Haemmerli, S. Morrow, H. Arendt and Keiran Sullivan (TSSG), with Keiran leading a paper in the area “Risk ontologies – Security or Trust? Terminological & Knowledge Organisation”, TKE 2010, Sept. 2010.
All in all not a bad output from a humble CSA.

New assignment in scientific and technological cooperation across Europe

July 22nd, 2010

I’ve recently returned from Tallinn, where the weather was hot and the city was beautiful. I was in Estonia for two items: one was the COST-ICT Annual Progress Conference and the other was the COST-ICT domain committee meeting.

Before getting into the conference and meetings, a little info on COST. Set up to run as an intergovernmental framework for European cooperation in science and technology, it allows for the coordination of nationally-funded research on a European level. COST does not fund research itself but provides a platform for European scientists to cooperate on a particular project and exchange expertise. These projects are called “Actions”.
A slide set overview of COST can be seen at this link [pdf].
My most recent experience of a COST Action has been through the ICT Action IC0703 TMA. The purpose of TMA is to host a community of researchers around the area of data traffic monitoring and analysis, particularly looking at the theory, techniques, tools and applications for future networks. I have seen this community in full effect when collecting and identifying network traffic trace collections. Also, Alan Davy has benefited from participating in a short-term scientific mission (STSM) in October 2009.
Anyway, back to the Annual Progress Conference, from which I learned about a much broader community of projects.

It was great to see Irish involvement in these Actions, with a particular stand-out being IC0804; well, I would say this, as John McLaughlin (TSSG) is participating in this energy efficiency action.
Now to the second part of this meeting, the COST-ICT domain committee meeting. I’ve played many sports in my time, and I always held out the hope that one day I would get to represent my country; I should have known better. But in the area of ICT, well, I’ve finally managed it.

I was recently nominated to represent Ireland on the COST ICT domain committee, and during the Estonian meeting that nomination was ratified. It’s a four year term and during that time I’ll get to interact more closely with a very vibrant and forward looking community. There will be responsibilities towards reviewing ongoing actions, new actions and also advising existing actions on their potential progress. This assignment (voluntary) will be a nice complement to my existing role in the TSSG.
I am looking forward to this journey and it couldn’t have started in a nicer place than Tallinn, Estonia.

FP6 has MORE than come to a close

September 17th, 2009

My first encounter with the FP6 IST programme was in Dublin Castle on July 12th 2002 at its Irish launch event. I remember the day well, funnily enough, as there were unexpected roadworks in South Kilkenny that morning, so once I made it (late) to the conference room in the Castle, it was packed with people and I got moved into one of those language translation booths, which was great: I had a higher viewing vantage point, a table and a very comfortable chair!
Well, seven years and eight FP6 projects later, the FP6 IST programme has come to a final chapter for me, as the IST MORE project is now technically complete.

Although, in fairness, my involvement in the research and development of IST MORE was peripheral, as really Chris, Gemma, Chen, Kristian, Niall D., and a whole host of others helped bring the project from a grand vision for a “Network-centric Middleware for GrOup communication and Resource Sharing across Heterogeneous Embedded Systems” to a neatly designed software-based middleware that hides the complexity of the underlying heterogeneity of embedded systems and provides a MORE simplified API and management mechanism.

And to prove that it is neat, the MORE middleware helped integrate the management of a medical process in Hungary (for doctors and patients). It helped create a virtual organisation, allowing chronic patients to be monitored continuously by sensors and accessed via mobile devices. The middleware in turn allowed for easy access to an on-line service for the doctors, diabetes patients and patients’ families to react to emergency situations, but of most benefit, it was found that the implemented system significantly decreased the number of necessary personal encounters between the doctor and the patient.
Watch the video below for further details

The project, just like any other framework programme project has tons of deliverables, but the one I want to point you towards is the document D5.1 Test Protocol [pdf], which details the building blocks for the MORE testing framework giving third-party developers a frame to test for correctness and compatibility. It also explicitly shows the test bed infrastructure for all the end user scenarios tested during the project, which is broken down into the laboratory environment test bed and the live field testing environment. The laboratory environment is complemented by an intensive test bed for performance evaluation, consisting of real world testing as well as simulation.
Finally, the project source is available for your viewing: either head over to the MORESS SourceForge page or just use svn directly

svn co https://mores.svn.sourceforge.net/svnroot/mores mores