Open Framework, Information Management Strategy & Collaborative Governance | Data & Social Methodology - MIKE2.0 Methodology
by: Bsomich
18  May  2015

MIKE2.0 Community Update


Missed what’s been happening in the MIKE2.0 data management community? Check out our latest update:


How Do You Define Your Master Data? 

There are numerous definitions for “master data” ranging from one sentence to a few paragraphs.  This is perhaps the most straightforward one I’ve come across:

Master data is the core data that is essential to operations in a specific business or business unit. - via Whatis.com

A clear and simple definition, yet many companies struggle to apply it when identifying and qualifying master data for their organizations.

Why do you think this is?

Although data is often looked at on a transactional basis, master data typically makes up a large percentage of the data elements in any given transaction. Common examples of master data include:

  • Customer data (name, contact details, DOB, customer classification)
  • Locality data (physical address, postal address, geographical data)
  • Product data (item number, bill of materials, product codes)
  • Employee data (employee number, role, placement in organisational structure)
  • Partner data (partner name, classification)

It is not unusual for this same data to be held in dozens or even hundreds of applications across a large organization, where it can be difficult to isolate and collect.  Much of it has sat in legacy systems for years, poorly integrated and at low levels of quality, and many organizations have poorly implemented Data Governance processes to handle changes in this data over time.
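To make the consolidation problem concrete, here is a minimal sketch of merging the same customer held in two systems into a single master record. It is illustrative only, not part of MIKE2.0: the field names, the match-on-name-and-birthdate rule, and the "first value wins" survivorship rule are all assumptions for the example.

```python
# Hypothetical sketch: consolidating customer master data held in two
# separate applications. Field names and matching rule are illustrative.

def normalize(record):
    """Canonicalize the fields used for matching across systems."""
    return (record["name"].strip().lower(), record["dob"])

def consolidate(*systems):
    """Merge records from several systems, keeping the first value seen
    for each field (a real survivorship rule would refine this)."""
    master = {}
    for system in systems:
        for record in system:
            key = normalize(record)
            merged = master.setdefault(key, {})
            for field, value in record.items():
                merged.setdefault(field, value)
    return list(master.values())

# The same customer, captured differently in two hypothetical systems
crm = [{"name": "Ada Lovelace ", "dob": "1815-12-10", "email": "ada@example.com"}]
billing = [{"name": "ada lovelace", "dob": "1815-12-10", "phone": "555-0100"}]

golden = consolidate(crm, billing)
# One master record, combining the email and phone from the two systems
```

Even this toy version shows why governance matters: someone has to own the matching rule and the survivorship rule, and keep them current as source systems change.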

MIKE2.0 offers an open source solution for managing master data that outlines many of the issues organizations have with identifying it.

We hope you find this offering of benefit and welcome any suggestions you may have to improve it.

Sincerely,

MIKE2.0 Community

Popular Content

Did you know that the following wiki articles are most popular on Google? Check them out, and feel free to edit or expand them!

What is MIKE2.0?
Deliverable Templates
The 5 Phases of MIKE2.0
Overall Task List
Business Assessment Blueprint
SAFE Architecture
Information Governance Solution

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


This Week’s Blogs for Thought:

5 Unusual Ways Businesses are Using Big Data

Big data is where it’s at. At least, that’s what we’ve been told. So it should come as no surprise that businesses are busy imagining ways they can take advantage of big data analytics to grow their companies. Many of these uses are fairly well documented, like improving marketing efforts, or gaining a better understanding of their customers, or even figuring out better ways to detect and prevent fraud. The most common big data use cases have become an important part of industries the world over, but big data can be used for much more than that. In fact, many companies out there have come up with creative and unusual uses for big data analytics, showing just how versatile and helpful big data can be.

Read more.

Cloud Computing and the Industries that Love It

Cloud computing provides greater security, virtually unlimited computing resources for research and development, cost savings, and advanced threat detection methods. With so many reasons to use cloud computing, it’s no wonder many industries have flocked to the new technology. Cloud technology serves as a form of outsourcing for companies, where some data is kept in house for better control, and other data is trusted to a third-party provider. Each industry that benefits from cloud computing has its own specific reasons for adopting the technology, but cloud computing is most profitable for companies that work with emerging markets and need quick and cost-effective scalability.

Read more.
Is Your Data Quality Boring? 

Let’s be honest here. Data Quality is good and worthy, but it can be a pretty dull affair at times. Information Management is something that “just happens”, and folks would rather not know the ins-and-outs of how the monthly Management Pack gets created. Yet I’ll bet that they’ll be right on your case when the numbers are “wrong.” Right?

So here’s an idea. The next time you want to engage someone in a discussion about data quality, don’t start by discussing data quality. Don’t mention the processes of profiling, validating or cleansing data. Don’t talk about integration, storage or reporting. And don’t even think about metadata, lineage or auditability.

Read more.

Forward this message to a friend

Questions?

If you have any questions, please email us at mike2@openmethodology.org. 

 

Category: Information Development
No Comments »

by: Bsomich
12  May  2015

Defining Master Data

There are numerous definitions for “master data” ranging from one sentence to a few paragraphs.  This is perhaps the most straightforward one I’ve come across:

Master data is the core data that is essential to operations in a specific business or business unit. - via Whatis.com

A clear and simple definition, yet many companies struggle to apply it when identifying and qualifying master data for their organizations.

Why do you think this is?

Although data is often looked at on a transactional basis, master data typically makes up a large percentage of the data elements in any given transaction. Common examples of master data include:

  • Customer data (name, contact details, DOB, customer classification)
  • Locality data (physical address, postal address, geographical data)
  • Product data (item number, bill of materials, product codes)
  • Employee data (employee number, role, placement in organisational structure)
  • Partner data (partner name, classification)

It is not unusual for this same data to be held in dozens or even hundreds of applications across a large organization, where it can be difficult to isolate and collect.  Much of it has sat in legacy systems for years, poorly integrated and at low levels of quality, and many organizations have poorly implemented Data Governance processes to handle changes in this data over time.

MIKE2.0 offers an open source solution for managing master data that outlines many of the issues organizations have with identifying it.

How do you define and qualify your master data?

Category: Information Development
No Comments »

by: Robert.hillard
25  Apr  2015

Experts make better decisions with an understudy

Most of us are experts at something.  An expert is someone who can reliably assess a situation and apply an appropriate advanced skill or technique.  Knowing what skill to apply and when is just as important as the technical capability that is applied.  Examples include medical specialists deciding whether to operate and, if so, how.

Knowing what skill to apply requires data.  For doctors, this is usually in the form of symptoms, for accountants, it’s the financial results and for engineers it’s the telemetry that is generated by almost all of the infrastructure that now surrounds us.

Understanding how we use data is really important.  Knowledge Management experts talk about tacit versus explicit knowledge.  The former is often hard to document or clearly communicate.  Yet, tacit does not imply that it is not based on data, but it is often using a complex combination of the facts at hand combined with the experience of the practitioner.

Even the best knowledge systems can’t match the interpretation of the data that the tacit knowledge of experts can achieve.  Although Big Data analytics solutions are making good progress they can’t make the sort of expert cognitive leaps that we rely on for some of our most critical decisions (see Your insight might save your job).  It’s going to be a while before our General Practitioner is replaced by a computer.

But how good are the decisions that experts make?  If the interpretation of the results is unambiguous then it is likely that an alert and capable expert will make the right decision.  Their choices can be validated by a second-in-charge such as a co-pilot in an airplane cockpit with a consensus almost certain.  But these are the sorts of decisions that are most at risk of automation.  What about those decisions that are dealing with imperfect data, ambiguous symptoms or the convergence of apparently unrelated issues?

Teaching makes us better experts

When we teach we challenge ourselves.  Many years ago, I had my first opportunity to teach students in my own discipline of data management.  At that point in my career I was already considered an expert and I was very used to delivering expert advice to clients.

What changed when I had to teach was the need to provide evidence and references.  In doing so, I was forced to critically examine my decision making process.  While my overall approach didn’t change, I found myself being more formal in the way I referenced my client work and I tried to not only satisfy my clients but also consider what my students would ask.

Open talent

There is a lot of talk around open talent models, with the likely result that organisations can access the global expert who can answer their specific question.  This is happening across the board, including disciplines such as management consulting, engineering, accounting, law and even medicine.

For many tasks this makes perfect sense.  An expert who can review the data and provide a specific answer, recommendation or diagnosis is incredibly valuable.  With social networks, finding such an expert is sometimes only a few clicks away even for the most obscure but specific facts to be reviewed.

I would argue, however, that if the expert is simply providing an answer to a specific question, then ultimately the expert’s role will be automated in the future.  Not only do these sorts of experts face redundancy through automation but even when they are using their skills to provide insight they are operating in a vacuum.  Their ideas go largely unchallenged and are not developed further.

The value of mentoring

Compare that situation to a practitioner who is working with a younger group who they are mentoring or teaching.  The questions they will be asked force them to evaluate their whole approach and, on occasion, change their view.

This is the reason why teaching and research go hand-in-hand.  It isn’t only the labour capacity that students and junior staff provide, it is also the perspective that they either bring to the table or that they trigger in their supervisors.

In my own field of Management Consulting, this is the most important function of graduates and junior staff.  They offer a refreshing perspective.  They assume that there are no dumb questions and are eager to learn.  In their eagerness, they don’t hesitate to question established orthodox perspectives.

This is the reason I am an optimist about the future for so many of our professions.  Despite the threat of automation and the enthusiasm for offshoring to a few experts, the really good decisions are usually made by experts who are surrounded by teams who are eager to learn from them.  There will be a role for this staffing model for many years to come.

Category: Enterprise2.0
No Comments »

by: Jonathan
17  Apr  2015

5 Unusual Ways Businesses are Using Big Data

Big data is where it’s at. At least, that’s what we’ve been told. So it should come as no surprise that businesses are busy imagining ways they can take advantage of big data analytics to grow their companies. Many of these uses are fairly well documented, like improving marketing efforts, or gaining a better understanding of their customers, or even figuring out better ways to detect and prevent fraud. The most common big data use cases have become an important part of industries the world over, but big data can be used for much more than that. In fact, many companies out there have come up with creative and unusual uses for big data analytics, showing just how versatile and helpful big data can be.

1. Parking Lot Analytics

Every business is trying to gauge how well they are doing, and big data is an important part of that. Perhaps some study the data that comes from their websites, or others look at how effective their marketing campaigns are. But can businesses measure their success by studying their parking lots? One startup is doing that very thing. Using satellite imagery and machine learning techniques, Orbital Insight is working with dozens of retail chains to analyze parking lots. From this data, the startup says it can assess the performance of each company without needing further information. Their algorithm uses deep learning to delve into the numbers and find unique insights.

2. Dating Driven By Data

Big data is changing the way people date. Many dating websites, like eHarmony, use the data they compile on their users to come up with better matches, increasing the odds they’ll find someone they’re compatible with. With open source tools like Hadoop, dating sites can gain detailed data on users through answers to personal questions as well as through behaviors and actions taken on the site. As dating sites collect more data on their customers, they’ll be able to more accurately predict who matches well with whom.

3. Data at the Australian Open

Many sports have adopted big data to get a better understanding of their respective games, but big data is also being used in a business sense in the sports world. The Australian Open relies heavily on big data during the tournament in response to the demands of tennis fans around the world. With big data, they can optimize tournament schedules and analyze information like social media conversations and player popularity. From there, the data is used to predict viewing demands on the tournament’s website, helping organizers determine how much computing power they need at any given time.

4. Dynamic Ticket Pricing

The NFL is also using big data analytics to boost their business. While it might seem like the NFL doesn’t need help in this regard, they still want to use big data to increase ticket sales. The goal is to institute variable ticket pricing, which has already been implemented by some teams. Using big data, NFL teams can determine the level of demand for specific games based on factors like where it falls in the season, who the opponent is, and how well the home team is playing. If it’s determined demand is high, ticket prices will go up. If demand is predicted to be low, prices will go down, hopefully increasing sales. With dynamic ticket pricing, fans wouldn’t have to pay high prices for games that are in low demand, creating more interest in the product, especially if a team is struggling.
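A toy model makes the mechanics of variable pricing concrete. This is a hypothetical sketch: the base price, the demand factors and their weights are invented for illustration and bear no relation to any team's actual pricing model.

```python
# Hypothetical sketch of demand-based ticket pricing.
# All constants and weights are illustrative assumptions.

BASE_PRICE = 80.0

def demand_score(week, rival_game, home_win_pct):
    """Combine a few demand signals into a score capped at 1.0."""
    late_season = week / 17          # later games weigh slightly more
    rivalry = 0.3 if rival_game else 0.0
    form = home_win_pct * 0.5        # a winning team draws bigger crowds
    return min(1.0, 0.2 * late_season + rivalry + form)

def ticket_price(week, rival_game, home_win_pct):
    """Scale the base price up or down around the demand score."""
    score = demand_score(week, rival_game, home_win_pct)
    return round(BASE_PRICE * (0.7 + 0.6 * score), 2)

# High demand: late-season rivalry game, home team winning 75% of games
high = ticket_price(week=16, rival_game=True, home_win_pct=0.75)
# Low demand: early-season game, no rivalry, struggling home team
low = ticket_price(week=2, rival_game=False, home_win_pct=0.25)
```

The interesting part in practice is not the formula but feeding it: the "big data" work is estimating demand from historical sales, opponent draw and team form, which the sketch reduces to three hand-set inputs.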

5. Ski Resorts and Big Data

Many ski resorts are truly embracing the possibilities of big data. This starts with basic ideas, like saving rental information, but big data can also be used to prevent ticket fraud, which can take out a good chunk of revenue. Most impressive is how big data is used to increase customer engagement through gamification. With Radio Frequency Identification (RFID) systems, resorts can actually track skiers, compiling stats like number of runs made, number of feet skied, and how often they get to the slopes. This data can be accessed on a resort’s website, where skiers can compete with their friends, earning better rankings and rewards that encourage them to spend more time on the slopes.
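The stat-tracking side of this is straightforward to sketch. The event fields and the choice of vertical feet as the ranking metric are assumptions for the example, not any resort's actual schema.

```python
# Illustrative sketch: rolling up RFID lift-scan events into per-skier
# stats and a leaderboard. Field names and scoring are assumptions.
from collections import defaultdict

def skier_stats(scan_events):
    """Aggregate raw lift scans into run counts and vertical feet."""
    stats = defaultdict(lambda: {"runs": 0, "vertical_ft": 0})
    for event in scan_events:
        s = stats[event["skier_id"]]
        s["runs"] += 1
        s["vertical_ft"] += event["lift_vertical_ft"]
    return stats

def leaderboard(stats):
    """Rank skiers by total vertical feet, the stat being gamified."""
    return sorted(stats.items(), key=lambda kv: kv[1]["vertical_ft"],
                  reverse=True)

events = [
    {"skier_id": "anna", "lift_vertical_ft": 1200},
    {"skier_id": "ben", "lift_vertical_ft": 900},
    {"skier_id": "anna", "lift_vertical_ft": 1200},
]
ranking = leaderboard(skier_stats(events))
# anna leads with 2 runs and 2400 vertical feet
```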

These cases show that with a bit of creative thinking, big data can help businesses in more ways than one. As companies become more familiar working with big data, it’s easy to see how unique and innovative solutions will likely become the norm. As unusual as some of these uses may be, they may represent only the beginning of many unique ventures in the future.

 

Category: Business Intelligence
No Comments »

by: RickDelgado
15  Apr  2015

Cloud Computing and the Industries that Love It

Cloud computing provides greater security, virtually unlimited computing resources for research and development, cost savings, and advanced threat detection methods. With so many reasons to use cloud computing, it’s no wonder many industries have flocked to the new technology. Cloud technology serves as a form of outsourcing for companies, where some data is kept in house for better control, and other data is trusted to a third-party provider. Each industry that benefits from cloud computing has its own specific reasons for adopting the technology, but cloud computing is most profitable for companies that work with emerging markets and need quick and cost-effective scalability.

The Chemicals Industry

Chemical companies are being driven to improve their flexibility, reduce costs, improve speed and become more responsive. Cloud computing provides this by transforming chemical businesses into thriving, cloud-based digital businesses. Chemical companies must be prepared to penetrate new markets quickly. Higher speeds and greater visibility are also continually evolving needs in this industry as collaboration becomes more important than ever.

As governments continue to push for green legislation, chemical companies often find themselves in the crosshairs of local and federal government regulating industries. Switching to cloud-based providers offers one way to increase accountability and reduce resource consumption. Additionally, as cost efficiency becomes increasingly important, chemical companies love the fact that cloud computing provides greater operational agility and increased cost savings across the entire industry.

Chemical companies use IaaS and SaaS in an effort to control costs, and use virtualization to create private cloud architectures for their businesses. For example, Dow Chemical met the requirements for 17 European countries by moving its operations to the cloud.

Law Firms

Law firms deal with large amounts of data on a regular basis, and they need to ensure that the data stays safe. In the past, law firms have mainly relied upon in-house servers to manage their operations. As the expense for maintaining the computers, servers, software and hiring IT administrators has grown, cloud computing has become an attractive alternative.

Even the simplest building closure can put a law firm out of reach of its data. This can seriously hamper an attorney’s ability to effectively manage clients and maintain a high level of service. By moving data to the cloud, law firms become better prepared to deal with disasters and can reduce the possibility of being without crucial data. Lawyers must also increasingly work outside the office to meet with clients and maintain a high level of effectiveness. Cloud computing makes accessing content securely while away from the office a much simpler and safer endeavor.

There are unique ethical considerations that any law firm must weigh when entrusting its data to a third party. Law firms can maintain control of their data while still utilizing cloud servers for advanced threat defense, security and non-sensitive applications. While law firms must perform due diligence and talk specifically with their cloud provider about data center locations, how data is treated, encryption levels, and their duties in the event of a subpoena, the move to the cloud offers a chance for greater efficiency, reliability and cost savings.

Startup Communication Companies

The trend with new startups is to get going quickly and define the work at a later date. Many startups don’t have a clear mission plan, and rely upon data received from initial product launches to determine the direction the company will take. Startups are in a unique position to be extremely flexible and adaptable. Established companies generally have preferred software applications, complex networking arrangements and a system that requires careful planning before any changes are made to the infrastructure. With a startup, the entire structure of the company can be taken down and rebuilt in a single business day, which makes working in the cloud a dream for new companies.

No industry knows the importance of flexibility more than the communications industry. With new technologies being developed and emerging daily, it’s become increasingly important to have a dynamic and scalable workspace for research and development. The cloud provides an ideal environment for companies that may need terabytes of data one day and a few gigabytes the next. The cloud provides an ideal situation where resources can be effectively managed without having to upgrade hardware, invest in costly data centers and hire several IT administrators to keep things running smoothly.

Government Agencies and Law Enforcement

Government agencies, including the CIA, FBI and local law enforcement, are continually evaluating cloud architecture to determine ways it can be utilized to increase efficiency, manage multiple departments and improve mobility. Governments largely consider cloud computing a safe alternative to traditional in-house servers, as it provides advanced threat detection and a high degree of security.

Cross-agency cooperation is essential for governments that need to share information on a state and federal level. By keeping information available in the cloud, state agencies can work more effectively with federal authorities. This makes it possible to share information quickly, and improve the ability to stop an advanced threat before it causes harm. Governments can use public cloud services for less critical information, and a private cloud service for the most sensitive data. This provides the best of both worlds.
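As a minimal sketch of that public/private split, routing records to a cloud tier by sensitivity might look like the following. The record types and their classification are invented for illustration.

```python
# Hypothetical sketch of the public/private cloud split described above.
# The record types and the SENSITIVE set are illustrative assumptions.

SENSITIVE = {"informant_identity", "ongoing_case_notes", "criminal_history"}

def route(record_type):
    """Route a record type to the cloud tier matching its sensitivity."""
    return "private_cloud" if record_type in SENSITIVE else "public_cloud"

# Less critical information can live in the public cloud...
tier_press = route("press_release")
# ...while the most sensitive data stays in the private cloud
tier_case = route("ongoing_case_notes")
```

In a real deployment the classification would come from an agency's data classification policy rather than a hard-coded set, but the routing decision itself is this simple.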

The Future of Cloud Computing

Any industry that needs a highly secure, adaptable and scalable computing environment can benefit from cloud computing. From the music industry to the local startup that is still defining its purpose, cloud computing can reduce costs, improve efficiency and increase security for any company. As governments continue to impose strict fines and penalties for failing to maintain good security practices, it has become more important than ever to safeguard company and customer information. The cloud does this at a low cost and with great flexibility.

 

Category: Enterprise Data Management
No Comments »

by: Bsomich
10  Apr  2015

MIKE2.0 Community Update

Missed what’s been happening in the MIKE2.0 data management community? Read on:


Community Announcement: MIKE2.0 and DAMA-International

This month, the MIKE2.0 Governance Association (MGA) is pleased to announce an agreement with DAMA-International commencing the transition of the MIKE2.0 community to DAMA-International. The agreement was announced last week at the DAMA-I Chapter Meeting at Enterprise Data World 2015 (EDW 15) in Washington, DC.

Under the transition, the MIKE2.0 community will now be provided by the existing DAMA-I Chapter structure and related member services and activities for the continued development and extension of the MIKE2.0 framework. “We at DAMA-I are delighted and honored to have the opportunity to progress and build upon the combined wisdom within MIKE2.0 and DAMA DMBOK. Merging practical with theoretical provides the ultimate approach to managing data,” said Sue Geuens, President of DAMA International, in a recent interview at Enterprise Data World.

After nearly a decade of community operations, the MIKE2.0 Governance Association is looking forward to the start of a new chapter under DAMA.  By solidifying this relationship, MGA is able to guarantee that all registered users and contributors to the MIKE2.0 project will continue to have an active community within which they can continue their professional networking, skills development, and intellectual contributions while utilizing and building the MIKE2.0 framework for information management.

“We are excited to reach an agreement with DAMA to solidify a sustainable future for MIKE2.0,” said Brenda Somich, community manager for MIKE2.0. “After years of dedication to this initiative, our team is grateful to know that the community will continue to expand and grow. As with the constant change and evolutionary nature of information, we are happy to announce that MIKE2.0 will also evolve.”

For any inquiries about MIKE2.0 and the acquisition by DAMA, please contact Rob Hillard, MGA board member and co-founder of MIKE2.0.

Sincerely,
MIKE2.0 Community

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


Did You Know?
All content on MIKE2.0 and any contributions you make are published under the Creative Commons license. This allows you free re-use of our content as long as you add a brief reference back to us.

 

This Week’s Blogs for Thought:

5 Challenges Facing the Internet of Things 

Our constant need to be connected has expanded beyond smartphones and tablets into a wider network of interconnected objects. These objects, often referred to as the Internet of Things (IoT), have the ability to communicate with other devices and are constantly connected to the internet in order to record, store and exchange data. This idea of an “always on, always connected” device does seem a little big brother-ish, but there are definitely some benefits that come with it.

Read more.

The Change You Can’t See: What’s Your Horse Carcass?

I had the pleasure this month of launching the Australian edition of Deloitte’s Tech Trends 2015 report.  For a number of years now, we’ve put our necks on the line to predict what will happen in the immediate, and slightly longer, term.  Looking back over recent years, we can see the rise of cloud, the reinvention of the role of core IT systems and the evolution of management of technology in business.

Read more.

The Debate Continues: The Future Impact of Net Neutrality on the Cloud

The debate over Net Neutrality is far from over. While the recent ruling by the FCC to classify broadband internet as a public utility may have changed the argument, debates will undoubtedly still continue to take place. The effects the decision has on the web will likely not be felt, let alone understood, for many years to come, but that hasn’t stopped speculation over what a neutral internet will actually look like and how companies and internet service providers (ISPs) will be impacted.

Read more.

  Forward this message to a friend

 

 

Category: Information Development
No Comments »

by: Bsomich
01  Apr  2015

Community Announcement: MIKE2.0 and DAMA-International

FOR IMMEDIATE RELEASE

Contact:
MIKE2.0 Governance Association
Pfingstweidstrasse 60, CH-8050
Zürich, Switzerland

March 31, 2015 – Zürich, Switzerland - This week, the MIKE2.0 Governance Association (MGA) is pleased to announce an agreement with DAMA-International commencing the transition of the MIKE2.0 community to DAMA-International. The agreement was announced on Monday evening at the DAMA-I Chapter Meeting at Enterprise Data World 2015 (EDW 15) in Washington, DC.

Under the transition, the MIKE2.0 community will now be provided by the existing DAMA-I Chapter structure and related member services and activities for the continued development and extension of the MIKE2.0 framework. “We at DAMA-I are delighted and honored to have the opportunity to progress and build upon the combined wisdom within MIKE2.0 and DAMA DMBOK. Merging practical with theoretical provides the ultimate approach to managing data,” said Sue Geuens, President of DAMA International, in a recent interview at Enterprise Data World.

 About MIKE2.0

With 869 articles and 29,675 members contributing their knowledge and experience, Method for an Integrated Knowledge Environment (MIKE2.0) is an online community providing a comprehensive methodology that can be applied across a number of different projects within the Information Management space. It was released under open source in 2006 although earlier efforts of the project date back to 2003. The MIKE2.0 Governance Association (MGA) has been the governing body since 2009. In 2013, the MGA team authored Information Development using MIKE2.0, translating many of the community’s core content assets to print to better reach an offline audience.

After nearly a decade of community operations, the MIKE2.0 Governance Association is looking forward to the start of a new chapter under DAMA.  By solidifying this relationship, MGA is able to guarantee that all registered users and contributors to the MIKE2.0 project will continue to have an active community within which they can continue their professional networking, skills development, and intellectual contributions while utilizing and building the MIKE2.0 framework for information management.

“We are excited to reach an agreement with DAMA to solidify a sustainable future for MIKE2.0,” said Brenda Somich, community manager for MIKE2.0. “After years of dedication to this initiative, our team is grateful to know that the community will continue to expand and grow. As with the constant change and evolutionary nature of information, we are happy to announce that MIKE2.0 will also evolve.”

For any inquiries about MIKE2.0 and the acquisition by DAMA, please contact Rob Hillard, MGA board member and co-founder of MIKE2.0.

Category: Information Development
No Comments »

by: Robert.hillard
28  Mar  2015

The change you can’t see, or what’s your horse carcass?

I had the pleasure this month of launching the Australian edition of Deloitte’s Tech Trends 2015 report.  For a number of years now, we’ve put our necks on the line to predict what will happen in the immediate, and slightly longer, term.  Looking back over recent years, we can see the rise of cloud, the reinvention of the role of core IT systems and the evolution of management of technology in business.

Interestingly, the time we’ve been doing these predictions in this particular format has coincided with a peculiar period in computing history when most of the innovation found in business started life in the consumer world.  To a large degree, this trend is the result of the boom of smartphones in the late noughties.

This is not the long-term norm.  Over the five decades since computing became an important part of business technology, the enterprise led and hobbyists had to pick up the scraps until they could get the price to a point where a product that appealed to consumers could be mass produced.

The return of the enterprise

This year has seen a return to the norm.  Some of the hyped technologies like 3-D printing turn out to have more application in business than the home.  Freed from looking to consumers, business has renewed confidence to innovate from the ground up.  This, in turn, has the potential to accelerate innovation and enable disruptive rather than evolutionary trends.

It is often hard to move consumer technologies quickly when big investments are required.  To get an acceptable return on this capital, large numbers of people need to be persuaded to open their wallets.  Enterprise solutions, on the other hand, can focus on niche problems without needing to worry about standards or mass movements of people, thanks to the rapid return on capital.

It’s entirely possible, for instance, that big business investments in autonomous vehicles will have the same impact that large-scale manufacturing had on advancing robotics in the 1970s and 1980s.  The renewed focus is particularly evident in mining, where large distances and controlled environments make early investment possible and rapid returns feasible.

This exciting new period of innovation leads us to ask whether we are really planning for our society of the future or if we are limiting our thinking.  The answer matters because a lot of money is being invested by both governments and business based on their current assumptions about what the future will bring.

Horse carcasses

In the late nineteenth century, city planners were dealing with the exponential growth of populations and wealth.  They were planning how to deal with one of the most visible forms of technology in every street: the horse.  In the last part of the 1800s, New York City had nearly 200,000 horses.  Given the tough conditions they worked under, many horses could expect to live just two to four years, with the carcasses being a problem on par with the food they required and the manure they produced.

The Times of London famously predicted around this time that every street would be buried under nine feet of horse-generated waste by 1950!

It’s no wonder that these were the problems city planners thought they would be dealing with through the then-new twentieth century.  Of course, the arrival of the motor car was both foreseeable by anyone looking back at the history of the internal combustion engine and almost unforeseen by the city planners of the time.

I wonder whether there are horse carcasses that we simply can’t look past when planning our society of the next century.

Reinventing our cities

A candidate list of these “horse carcasses” would have to start with transport, the issue that planners of the turn of the last century couldn’t look past.  Debate rages in cities around the world about the amount of investment that should go into roads and public transport (predominantly rail).  At the same time, the autonomous car, which seemed a dream just a few years ago, is very much on the verge of reality.

Autonomous vehicles, operating in concert and optimised through the power of analytics, can increase the density of road traffic by an order of magnitude, allowing cities to utilise existing roads with little need to upgrade capacity for many decades.  Similarly, a considered approach to shared resources could effectively render public transport as we know it today obsolete.

Imagine a future where you simply press a button on your smart device and a vehicle takes you where you want to go.  No waiting, timetables or congestion.

Reinventing our society

Any serious vision of the future has to consider economics and the future of growth.  For the vast majority of human history, the underlying economic growth rate has been a fraction of one percent, compared to the high growth achieved since the industrial revolution.

The big question is whether society should build in assumptions about growth and the only tool that can achieve it: technology.  Looking back from the vantage of the late twenty-first century, the answer will seem obvious, and we will kick ourselves if we haven’t either taken advantage of the great opportunity to leverage our growth or, conversely, saved wisely for a more austere future.

One of the most important decisions that the growth or austerity alternative futures will drive is our willingness to invest in our personal wellbeing through healthcare.  We are planning our society around increasing healthcare costs and actively thinking about rationing schemes around the world.  But what happens if the human body simply becomes a technology problem that gets solved?

It is very hard to look past humans living for 80 to 100 years.  While recent centuries have seen a substantial increase in our life expectancy, it hasn’t been transformational.  Rather, more people have enjoyed a healthier life towards the higher end of the expected lifespan.

Science, however, could be on the verge of breaking through the current barriers of our biology with an almost unimaginable impact on our society.

Reinventing our work lives

Futurists of the 1960s and 1970s expected the twenty-first century to be challenged by a lack of employment.  While the century’s first decades show no sign of realising this prediction, they could still be right: the second generation of artificial intelligence could finally make this vision a reality.

Such a society could be extremely wealthy and choose to put the interests of its people first by sharing the opportunity to contribute while rewarding outcomes over effort.  Conversely, the wealth could easily get concentrated in the hands of a few with little opportunity for those without a role to enjoy the spoils.

Reinventing our diets

Finally, even what we eat is likely to change in ways that seem unimaginable.  Most futurists agree that an increasingly wealthy world will create huge demand for protein, primarily meat, which requires an enormous amount of land to produce.

An alternative now seems likely.  Technology which can produce meat from stem cells, independently of any animal, is nearing maturity.  Such a product could be indistinguishable from meat produced from a slaughtered animal.  The change this would cause in agribusiness would be of the same magnitude as the digital disruption so many industries are already experiencing.

Maybe this is the carcass we can’t look past today.  Rather than a horse, this carcass belongs to cows, sheep and poultry.

Category: Enterprise2.0

by: Jonathan
26  Mar  2015

5 Challenges facing the Internet of Things

Our constant need to be connected has expanded beyond smartphones and tablets into a wider network of interconnected objects. These objects, often referred to as the Internet of Things (IoT), have the ability to communicate with other devices and are constantly connected to the internet in order to record, store and exchange data.

This idea of an “always on, always connected” device does seem a little Big Brother-ish, but there are definite benefits that come with it. For example, we are already seeing smarter thermostats, like Nest, that allow us to remotely control the temperature of our homes. We also have appliances and cars with internet connectivity that can learn our behavior and act on their own to provide us with greater functionality. However, while this trend is accelerating, with many objects already on the market, a number of challenges still face the IoT and will continue to hinder its progress and widespread adoption.

Security

It seems as if every discussion surrounding networks and the internet is always followed by a discussion on security. Given the recent publicity of damaging security breaches at major corporations, it’s hard to turn a blind eye to the dangers of more advanced cyber attacks. There’s no hiding the fact that the introduction of IoT will create a number of additional vulnerabilities that’ll need to be protected. Otherwise, these devices will simply turn into easy access points for cyber criminals. Because IoT devices are such new technology, there aren’t many security options designed specifically for them. Furthermore, the diversity in device types makes uniform solutions very difficult. Until we see greater security measures and programs designed to handle IoT devices, many will remain hesitant to adopt them for personal and professional use.

Privacy

On the coattails of security comes privacy. One of the bigger debates in this age of data is who actually owns the data being created: the users of these devices, the manufacturers, or those who operate the networks? Right now, there’s no clear answer. Regardless, while we are left arguing over who owns what information, these devices are tracking how we use them. Your car knows which route you take to work, and your home knows what temperature you prefer in the mornings. In addition, when you consider that almost everything requires an online profile to operate these days, there can be a tremendous amount of private information available to many different organizations. For all we know, our televisions are watching us as we watch our favorite shows, and sending that information to media companies.

Interoperability

In order to create a pure, interconnected IoT ecosystem, there needs to be a seamless experience between different devices. Currently, we haven’t yet achieved that level of interoperability. The problem is that there are so many different makes and models that it’s incredibly difficult to create an IoT system with horizontal platforms that are communicable, operable, and programmable. Right now, IoT communication is fragmented, and many devices are still not able to ‘talk’ with one another. Manufacturers will need to start playing nice with each other, and create devices that are willing to work with competitors.

WAN Capacity

Existing Wide Area Networks (WANs) were built for moderate-bandwidth requirements and are capable of handling current device needs. However, the rapid introduction of new devices will dramatically increase WAN traffic, which could strangle enterprise bandwidth. With the growing popularity of Bring Your Own Device policies, people will begin using IoT devices at work, forcing companies to make the necessary upgrades or suffer crawling speeds and weakened productivity.

Big Data

IoT technology will benefit and simplify many aspects of our lives, but these devices serve a dual purpose, benefiting organizations hungry for information. We live in an era of big data, where organizations are looking to collect information from as many sources as possible in the hopes of learning more about customers and markets. IoT technology will greatly expand the possibilities of data collection. However, the problem then becomes managing this avalanche of data. Storage issues aside, we’ve only just developed improved ways of handling big data analytics, but technologies and platforms will need to further evolve to handle additional demands.

Category: Web2.0

by: RickDelgado
24  Mar  2015

The Debate Continues: The Future Impact of Net Neutrality on the Cloud

The debate over Net Neutrality is far from over. While the recent ruling by the FCC to classify broadband internet as a public utility may have changed the argument, debates will undoubtedly still continue to take place. The effects the decision has on the web will likely not be felt, let alone understood, for many years to come, but that hasn’t stopped speculation over what a neutral internet will actually look like and how companies and internet service providers (ISPs) will be impacted. At the same time, the future of cloud computing has become a hot topic as experts debate if Net Neutrality will be a boost to cloud providers or if the overall effect will be negative. Looking at the current evidence and what many providers, companies, and experts are saying, the only thing that’s clear is that few people can agree on what Net Neutrality will mean for the cloud and all the advantages of cloud computing.

The basic idea of Net Neutrality is, in the simplest of terms, to treat all internet traffic the same. Whether from a small niche social site or a major online retail hub, content would be delivered equally. This sounds perfectly reasonable on the surface, but critics of the Net Neutrality concept say all websites simply aren’t equal. Sites like Netflix and YouTube (mainly video streaming sites) eat up large amounts of bandwidth when compared to the rest of the internet, and as streaming sites grow in popularity, they keep eating up more and more web resources. The theory goes that ISPs would provide internet “fast lanes” to those sites willing to pay the fee, giving them more bandwidth in comparison to other sites, which would be stuck in “slow lanes.” It’s this idea that proponents of Net Neutrality want to guard against, and it’s one of the biggest points of contention in the debate.

Obviously, this is a simplified view of Net Neutrality, but it’s a good background when looking at the effect the new ruling could have on cloud computing. First, let’s take a look at how cloud providers may be affected without a neutral internet. Supporters of Net Neutrality say a “fast lane” solution would represent an artificial competitive advantage for those sites with the resources to pay for it. That could mean a lack of innovation on the part of cloud vendors as they spend added funds to get their data moved more quickly while getting a leg up on their competition. A non-neutral internet may also slow cloud adoption among smaller businesses. If a cloud software provider has to pay more for fast lanes, those costs can easily be passed on to the consumer, which would raise the barrier to cloud use. The result may be declining cloud adoption rates or, at the least, degraded performance of cloud-based software.

On the other side of the coin, critics of Net Neutrality say the effect of the policy will end up damaging cloud computing providers. They’re quick to point out that innovation on the internet has been rampant without new government regulations, and that ISPs could easily develop other innovative solutions besides the “fast lane” approach Net Neutrality supporters are so afraid of. Government rules can also be complicated and, in the case of highly technical fields, would need to be constantly updated as new technology is developed. This may give larger companies and cloud providers an advantage over their competition since they would have the resources to devote to lobbyists and bigger legal budgets to dedicate to understanding new rules. There’s also the concern over getting the government involved in the control of pricing and profits in the first place. Needless to say, many aren’t comfortable with giving that level of control to a large bureaucracy and would rather let market freedom take hold.

Some may say that with the new FCC ruling, these arguments don’t apply anymore, but changes and legal challenges will likely keep this debate lively for the foreseeable future. Will Net Neutrality lead to government meddling in cloud provider pricing and contracts? Will a lack of Net Neutrality slow down cloud adoption and give too much power to ISPs? Unfortunately, there’s no way of knowing the far-reaching consequences of the decision on the cloud computing landscape. It could end up having very little impact in the long run, but for now, it appears Net Neutrality will become a reality. Whether that’s a good or bad thing for the cloud remains to be seen.

Category: Enterprise Data Management
