Month: March 2015

Internet of Things: Where Does the Data Go? | WIRED

THE INTERNET OF Things means different things to different people. To vendors, it’s the latest in a slew of large-scale trends to affect their enterprise customers, and the latest marketing bandwagon they have to consider. To enterprise organizations, it’s still a jumble of technical standards, conflicting opinions and big potential. For developers, it’s a big opportunity to put together the right mix of tools and technologies, and probably something they are already doing under another name. Understanding how these technologies work together on a technical level is becoming important, and will provide more opportunities to use software design as part of the overall business.

As Internet of Things projects go from concepts to reality, one of the biggest challenges is how the data created by devices will flow through the system. How many devices will be creating information? How will they send that information back? Will you be capturing that data in real time, or in batches? What role will analytics play in the future?

These questions have to be asked in the design phase. For the organizations that I have spoken to, this preparation phase has been essential to making sure the right tools are used from the start.

It is helpful to think about the data created by a device in three stages. Stage one is the initial creation, which takes place on the device, after which the data is sent over the Internet. Stage two is how the central system collects and organizes that data. Stage three is the ongoing use of that data in the future.

For smart devices and sensors, each event can and will create data. This information can then be sent over the network back to the central application. At this point, one must decide which standard the data will be created in and how it will be sent over the network. For delivering this data back, MQTT, HTTP and CoAP are the most common standard protocols used. Each of these has its benefits and use cases.

HTTP provides a suitable method for passing data back and forth between devices and central systems. Originally developed for the client-server computing model, today it supports everything from everyday web browsing to more specialist services around Internet of Things devices. While it meets the functionality requirements for sending data, HTTP includes a lot of extra data around the message in its headers. When you are working in low-bandwidth conditions, this can make HTTP less suitable.
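To make that overhead concrete, here is a minimal sketch that builds a bare-bones HTTP POST carrying a small sensor reading and compares header bytes to payload bytes. The endpoint, host name and JSON fields are hypothetical, and a real request would typically carry even more headers than this.

```python
# Illustrative sketch: header overhead of a minimal HTTP POST carrying a
# small sensor reading. The endpoint and header values are hypothetical.
payload = '{"temp": 21.5}'

request = (
    "POST /telemetry HTTP/1.1\r\n"
    "Host: collector.example.com\r\n"
    "Content-Type: application/json\r\n"
    f"Content-Length: {len(payload)}\r\n"
    "\r\n"
) + payload

header_bytes = len(request) - len(payload)
print(f"payload: {len(payload)} bytes, headers: {header_bytes} bytes")
```

Even this stripped-down request spends several times more bytes on headers than on the reading itself, which is the cost that matters on constrained links.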

MQTT was developed as a protocol for machine-to-machine and Internet of Things deployments. It is based on a publish/subscribe model: devices deliver messages to a central system that acts as a broker, which then distributes them to all of the other systems that consume them. New devices or services can simply connect to the broker as they need messages. MQTT is lighter than HTTP in terms of message size, so it is more useful for implementations where bandwidth is a potential issue. However, it does not include encryption as standard, so this has to be considered separately.
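The publish/subscribe pattern at the heart of MQTT can be sketched in a few lines. This in-memory broker is purely illustrative (not a real MQTT implementation, and the topic name is hypothetical), but it shows the key property: publishers and subscribers only know about topics on the broker, never about each other.

```python
from collections import defaultdict

# Illustrative sketch of the publish/subscribe pattern MQTT uses: devices
# publish to topics on a broker, and any number of consumers subscribe.
# This toy in-memory broker is not a real MQTT implementation.
class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of the topic.
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
received = []
broker.subscribe("sensors/temperature", received.append)
broker.publish("sensors/temperature", {"device": "d1", "temp": 21.5})
print(received)
```

Because new consumers just subscribe to the broker, adding a service never requires touching the devices themselves.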

CoAP is another standard developed for low-power, low-bandwidth environments. Rather than being designed for a broker system like MQTT, CoAP is more aimed at one-to-one connections. It is designed to meet the requirements of REST design by providing a way to interface with HTTP, but still meet the demands of low-power devices and environments.

Each of these protocols supports taking information or updates from an individual device and sending it to a central location. However, the greater opportunity lies in how that data is then stored and used in the future. There are two main concerns here: how the data is acted upon as it comes into the application, and how it is stored for future use.

Across the Internet of Things, devices create data that is sent to the main application to be passed on, consumed and used. Depending on the device and its network and power-consumption constraints, data can be sent in real time or in batches at any time. However, the real value is derived from the order in which data points are created.

This time-series data has to be accurate for Internet of Things applications. If not, then it compromises the very aims of the applications themselves. Take telemetry data from vehicles. If the order of data is not completely aligned and accurate, then it points to potentially different results when analyzed. If a certain part starts to fail in particular conditions – for example, a temperature drop at the same time as a specific level of wear – then these conditions have to be accurately reflected in the data that is coming through, or it will lead to false results.

Time-series data can be created as events take place around the device and then sent. This use of real-time information provides a complete record for each device, as it happens. Alternatively, it can be collated as data is sent across in batches – the historical record of data will be there, it just isn’t available in real time. This is common with devices where battery life is a key requirement over the need for data to be delivered in real time. Either way, the fundamental requirement is that each transaction on each device is put in at the right time-stamp for sorting and alignment. If you are looking at doing this in real time with hundreds of thousands or potentially millions of devices, then write-speed at the database level is an essential consideration.
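The mix of real-time and batched delivery described above means arrival order and event order can differ, so the store has to rebuild the true timeline from the device time stamps. A minimal sketch, with hypothetical device names and fields:

```python
# Illustrative sketch: readings arrive both in real time and in delayed
# batches, but analysis depends on time-stamp order, not arrival order.
# Device names, time stamps and fields are hypothetical.
realtime = [
    {"device": "pump-1", "ts": 1425200000, "temp": 4.0},
    {"device": "pump-1", "ts": 1425200060, "temp": 3.5},
]
batched = [
    {"device": "pump-2", "ts": 1425199990, "temp": 5.1},
    {"device": "pump-2", "ts": 1425200030, "temp": 4.8},
]

# Merge and sort on the device time stamp to rebuild the true event order.
timeline = sorted(realtime + batched, key=lambda r: r["ts"])
print([r["ts"] for r in timeline])
```

The batched readings slot in ahead of some real-time ones once sorted, which is exactly why the time stamp has to be assigned on the device, not on arrival.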

Each write has to be taken as it is received from the device and put into the database. For more traditional relational database technologies, this can be a limiting factor, as write requests can exceed what the database was built to handle. When you need all the data from devices in order to create accurate and useful information, this potential loss can have a big impact. For the organizations I have spoken to about Internet of Things projects, NoSQL platforms like Cassandra provide a better fit for their requirements.

Part of this is due to the sheer volume of writes that something like Cassandra is capable of; even with millions of devices creating data all the time, the database is designed to ingest that data as it is created. However, it is also due to how the databases themselves are designed. Traditional databases have a primary-replica arrangement, where the lead database server handles all the transactions and synchronously passes them along to other servers if required. This leads to problems in the event of an outage or server failure, as a new primary has to be put into place, leading to potential data loss.

For properly configured distributed database systems like Cassandra, there is no ‘primary’ server that is in charge; each node within a cluster can handle transactions as they come in, and the full record is maintained over time. Even if a server fails, or a node is removed due to loss of network connectivity, the rest of the cluster can continue to process data as it comes in. For time-series data, this is especially valuable as it means that there should be no loss of data in the list of transactions over time.
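The masterless idea can be sketched with a simple key-hashing scheme: any live node can take a write, and losing a node just redirects keys to the nodes that remain. This is a toy illustration only; real Cassandra uses a token ring with replication, and the node names here are hypothetical.

```python
import hashlib

# Illustrative sketch of a masterless cluster: any live node can accept a
# write, chosen here by hashing the row key across whichever nodes are up.
# Node names are hypothetical; real Cassandra uses a token ring + replicas.
def pick_node(key, nodes):
    digest = int(hashlib.md5(key.encode()).hexdigest(), 16)
    return nodes[digest % len(nodes)]

nodes = ["node-a", "node-b", "node-c"]
print(pick_node("device-42:1425200000", nodes))

# If a node fails, writes keep flowing to the remaining nodes:
nodes.remove("node-b")
print(pick_node("device-42:1425200000", nodes))
```

The point is that there is no single server whose failure stalls ingestion, which is what protects the continuity of the time-series record.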

Once you have this store of time-series data, the next opportunity is to look for trends over time. Analyzing time-series data provides the opportunity to create more value for the owners of the devices involved, or carry out automated tasks based on a certain set of conditions being met. The typical example is the Internet-connected fridge that realizes it is out of milk; however, Internet of Things data is more valuable when linked to larger private or public benefits, and with more complex condition sets that have to be met. Traffic analysis, utility networks and use of power across real estate locations are all concerned with consuming data from multiple devices in order to spot trends and save money or time.
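Acting on a condition set can be sketched directly from the vehicle-telemetry example above: scan ordered readings and flag any point where a temperature drop coincides with a given wear level. The thresholds, field names and values are all hypothetical.

```python
# Illustrative sketch of acting on a condition set: flag a part when a
# temperature drop coincides with high wear, echoing the vehicle telemetry
# example above. Thresholds and field names are hypothetical.
def at_risk(readings, temp_drop=2.0, wear_limit=0.8):
    alerts = []
    for prev, curr in zip(readings, readings[1:]):
        dropped = prev["temp"] - curr["temp"] >= temp_drop
        worn = curr["wear"] >= wear_limit
        if dropped and worn:
            alerts.append(curr["ts"])
    return alerts

readings = [
    {"ts": 1, "temp": 20.0, "wear": 0.7},
    {"ts": 2, "temp": 17.5, "wear": 0.85},  # 2.5-degree drop with high wear
    {"ts": 3, "temp": 17.0, "wear": 0.9},
]
print(at_risk(readings))
```

Note that the check compares consecutive readings, which is why the ordering guarantees discussed earlier matter: shuffle the rows and the same data yields different alerts.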

In this environment, it’s helpful to think about when the results of the analytics will be required: is there an immediate, near real-time need for analysis, or is this a historical requirement? The popularity of Apache Spark for analysis of big data, and of Spark Streaming for analysis in near real time, has continued to grow, and when combined with the likes of Cassandra it can provide developers with the ability to process and analyze large, fast-moving data sets alongside each other.
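A common building block of that near real-time analysis is a sliding window over the incoming stream. The sketch below shows the idea in plain Python (the values and window size are hypothetical); Spark Streaming performs the same kind of windowed computation distributed across a cluster.

```python
from collections import deque

# Illustrative sketch of near real-time stream analytics: a fixed-size
# sliding window of means over incoming readings, the kind of computation
# Spark Streaming performs at scale. Values and window size are hypothetical.
def windowed_means(stream, size=3):
    window, means = deque(maxlen=size), []
    for value in stream:
        window.append(value)
        if len(window) == size:
            means.append(sum(window) / size)
    return means

print(windowed_means([2.0, 4.0, 6.0, 8.0]))  # two full windows of three
```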

However, this is not just about what is taking place right now. The value from time-series data can come over time just as well. As an example, i2O Water in the UK looks at information around water pressure, taken from devices that are embedded in water distribution networks around the world. This data has been gathered over two years and is stored in a Cassandra cluster. The company uses this information for its analytics and to alert customers to where maintenance might be needed.

This data has its own value for the company. It has a ready-made source of modeling and analytics information for customers that can be used around new products too. This is down to the interesting way that the company has architected its applications in a modular fashion; when a new module or service is added, the time-series data can be “played” into the system as if the data was being created. This can then be used for analytics and to show how the devices on the water network would have reacted to the variations in pressure or other sensor data during that time.
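The replay idea can be sketched simply: feed stored records to a new module in time-stamp order, as if they were arriving live. This is an illustration in the spirit of the i2O Water design described above, not their actual code; the handler and records are hypothetical.

```python
# Illustrative sketch of "playing" stored time-series data into a new
# module as if it were arriving live, in the spirit of the i2O Water
# design described above. The handler and records are hypothetical.
def replay(history, handler):
    # Feed historical records to the new module in time-stamp order.
    for record in sorted(history, key=lambda r: r["ts"]):
        handler(record)

history = [
    {"ts": 300, "pressure": 2.1},
    {"ts": 100, "pressure": 2.4},
    {"ts": 200, "pressure": 2.3},
]
seen = []
replay(history, seen.append)
print([r["ts"] for r in seen])
```

Because the new module consumes the replayed stream through the same interface it would use for live data, it needs no special backfill logic.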

For i2O Water, the opportunity here is to add services that demonstrate more value back to the utility companies that are customers. The value of water will only increase as more people need access, which in turn makes accurate and timely data more valuable. This is a good example of how connecting devices and data can improve lives as well as create new opportunities for the companies involved.

The ability to look back at time-series data has the most far-reaching consequences for the Internet of Things as a whole. Whether it’s for private sector gain or public sector good, the design of the application and how that data is stored over time is essential to understand. When designing for the Internet of Things, the role of distributed systems that can keep up with the sheer amount of data being created is also important.
As found on…

How old were you when you first felt empathy?


There were many highlights for me last week but one has me compelled to share.

Backstory: About two months back my son (6), of his own accord, decided to open a juice stand in front of his Grandma and Grandpa’s home. His business model was solid: have someone else pay for the fruit while he provides the labor. What had me proud was not the fact that he crafted an entrepreneurial venture while others his age were playing Wii or Xbox; it was the fact that he took it upon himself to use this juice stand to help promote and raise money for Project Hope Alliance.

Our family joined the Project Hope Alliance family in 2014, after I joined their Board of Directors. The mission, accomplishments and highly passionate team spoke to me and my family. The rapidly growing organization embodies an entrepreneurial spirit rarely found in 40-year-old non-profits.

Last week, Suzi Diaz from Project Hope Alliance visited the kids in my son’s first grade class. She explained how she had the coolest job in the world and was fortunate enough to help kids move into new homes with their families. She explained to the room of 50 or so six-year-olds that in Orange County alone there are nearly 30,000 kids, who look and act just like them, who are without homes.



What happened next was simply inspiring. Suzi asked the group a simple question: “Has anyone ever done something for you that made you feel good?”

Hands shot up as one kid after another shared stories about how other kids had helped them up after they fell at school, or told them that they liked their haircut.

Suzi then asked if they have ever made someone ELSE feel good by saying nice things or helping someone do something they couldn’t do alone.

Hands shot up, and one by one the kids described how they had helped others. It was then I realized that this room of first graders not only understood empathy but had already, at this young age, demonstrated it.

As Suzi went on to explain how Project Hope Alliance helps kids and their families end the cycle of homelessness, I could see one kid after another (and teachers too) connect the dots between empathy and action: doing good for others makes me feel good too.

I am proud of my son for feeling and showing empathy, and taking action.

I am proud of Project Hope for helping our next generation awaken the spirit of empathy that lives within.


To learn more about what moves the CEO of Project Hope, watch this amazing video

To learn more about Project Hope Alliance or to have Suzi and team come speak to your kids’ school, visit:

Ushering in a New Era of Intelligent Building and Energy Management « A Smarter Planet Blog

By Joe Phillips
As you sit in your office reading this story, consider this: you’re surrounded by data.
Computers, lights, power strips, air conditioning, elevators, alarms and meters – all of this is generating data inside the building. This data can reveal powerful information to make offices, campuses and large buildings work better.
The Internet of Things has entered the building, and this explosion of data constantly reports on what’s going on, but often it’s not easy to use. Many organizations don’t see or take advantage of data as well as they could. They often operate on a system-by-system, building-by-building basis with little correlation to business outcomes.
A broader approach is necessary, one that integrates the facilities portfolio as closely as possible to the business needs. To tackle this problem and address the concern that by 2025 buildings will be the top consumers of energy worldwide, IBM is announcing an innovative project with Carnegie Mellon University to deliver a cloud-based analytics system for reducing energy and facility operating costs.
With 6.5 million square feet of infrastructure, miles of underground utilities, water lines, electrical systems, health facilities, restaurants and even its own police force, Carnegie Mellon is practically a city unto itself. In fact, it would be in the top five percent of municipalities in Pennsylvania if it were an incorporated town.
By harvesting intelligence, best practices and value from the big data of buildings, the university expects to save approximately 10 percent on utilities — nearly $2 million annually — when the IBM system is fully deployed across 36 buildings on its Pittsburgh campus. This is a campus where the first building was built in 1906, and the most recent is under construction now. More than a hundred years of infrastructure can all be managed through a single system: the new IBM Building Management Center, delivered on the IBM SoftLayer cloud, monitors thousands of data points from building automation and control systems to deliver better building performance, energy efficiency and space utilization.
Joe Phillips, IBM Smarter Buildings Leader
Data and information management are the new tools of facility management. Once an organization has the right data in the right hands, it can enter a new era where managers learn things about buildings that couldn’t be seen before. Additionally, making buildings work better can lift the bottom line for businesses.  Facilities operation and management is typically the second biggest cost center for most companies, after payroll.
Armed with powerful new information from Big Data and analytics, maybe one day buildings will no longer use 42 percent of our energy supply or be the number one contributor to CO2 emissions.
By the way, be sure to turn the lights off when you leave.

Tim O’Reilly: Silicon Valley is massively underestimating the impact of IoT (interview)

Smart forks. Smart tennis rackets. Smart toothbrushes. Smart teddy bears. Smart fitness bands.

The Internet of Things hype is inspiring an endless stream of connected gadgets that light up Kickstarter campaigns and pack the halls of consumer electronics conferences.

But Tim O’Reilly, the technology publisher and Silicon Valley guru, is worried that these products have hijacked the conversation about IoT and distracted people by making them focus on what amount to novelty items.

It’s frustrating for O’Reilly, one of the sharpest observers of tech trends for the past three decades, because he sees something about to happen on a grand, unprecedented scale and he worries that people aren’t discussing it enough. He believes that the coming IoT era will result in more sweeping changes to our lives, our work, and our communities than those brought about by the eras of the PC, Internet or smartphones.

“Obviously, Silicon Valley is all over this,” O’Reilly said. “But I think they are missing the point. They are creating some gadgets, but they aren’t thinking about systems.”

This massive disruption, O’Reilly said, will be powered by elements that are well-known: powerful, low-cost sensors that become ubiquitous; mass adoption of mobile gadgets that serve as hubs for IoT devices; blazing fast wireless networks; and all the data these things will generate.

But rather than talk about a smartwatch that monitors fitness and activity, O’Reilly wants people thinking about how to disrupt the entire healthcare system. How can healthcare be reimagined if people’s health information is not only monitored, but immediately shared with doctors and nurses? Can each health care provider serve 10 times, or 100 times, as many people, and improve the quality of care while sharply reducing the cost?

Rather than think about smart cars, O’Reilly wants to talk about changing entire parking and traffic systems throughout cities to send drivers information that reduces the number of cars, cuts the time spent looking for things like parking, and prevents the need to build new roads. Can information generated by cities and cars be used to radically improve transportation in ways that save money and help the environment?

The example that O’Reilly cites of the massive disruptive potential of IoT is one that might surprise people: Uber.

While most people wouldn’t think of Uber as an IoT company, O’Reilly says that is the problem. Uber represents the kind of systematic change that interests him, a change that doesn’t just focus on sticking a sensor in a gadget.

“This company isn’t put in this category,” O’Reilly said. “But it’s teaching us a lot.”

The first lesson is about the way the very nature of work can change. Certainly, Uber has generated a lot of controversy in that respect. But O’Reilly says the flow of information created by Uber is revolutionary in that it allows drivers to decide when and how long they want to work, and what they can earn.

“It changes the whole workflow,” O’Reilly said. “And it changes the way we think about these things. You empower workers and they can have more flexibility.”

Uber is also disrupting payments, O’Reilly says, more so than even much-hyped services like Apple Pay. With Apple Pay, you remove the need to take out one payment device, a credit card, and replace it with the need to take out another payment device, a smartphone or Apple Watch.

But with Uber, once the service is booked, payment just happens when the ride is over. No need for another action by consumers. O’Reilly sees a day when connected gadgets allow for payment systems where stores and machines simply recognize people and conduct a whole transaction automatically.

“What Uber is doing with payments may be more important in the long run than Apple Pay,” O’Reilly said. “Apple Pay recreates the old workflow, just with a new device. It would be revolutionary to say we don’t need that at all.”

O’Reilly understands that at the moment, IoT prompts more than a little eye-rolling in a tech industry that has grown a bit weary of hearing about it, and of the parade of new devices everyone is hawking. It’s not quite an IoT bust, but there’s certainly a fair bit of fatigue that reminds O’Reilly of the years after the dot-com bust.

Back then, people didn’t want to talk any more about the revolutionary potential of the Internet, something that many believed had been oversold to the public, creating the dot-com bubble that burst and took the U.S. economy down with it for a time.

But that’s exactly why O’Reilly says he wants to reboot the conversation now, to energize people who will think more broadly about the potential of IoT, and will develop the systems and platforms that will make it all happen.

“This wave of technology has more chance of re-imagining whole swathes of the world than anything we’ve seen before,” O’Reilly said. “This is really going to disrupt everything.”

Soon You’ll Be Able to Buy IKEA Furniture That Charges Your Electronics Wirelessly

Hate the dreaded task of locating and then plugging your various chargers into the wall? IKEA’s on it.

On Sunday, the Swedish furniture company announced it is rolling out a collection of tables, desks and lamps embedded with wireless charging pads that can power up electronics through an energy induction transfer, rendering the charger unnecessary.

IKEA will also sell wireless charging kits that can be built into existing furniture for 30 euros, or about $33, according to the Wall Street Journal.

Both the furniture and the charging kits are slated to hit stores in April.