by: Bsomich
28  Jun  2015

MIKE2.0 Community Update

Interested in better data management? See what’s been happening in the MIKE2.0 community this month:

 

  

Have you seen our Open MIKE Series? 

The Open MIKE Podcast is a video podcast that discusses aspects of the MIKE2.0 framework and features content contributed to MIKE2.0 wiki articles, blog posts, and discussion forums.

You can scroll through the Open MIKE Podcast episodes below:

For more information on MIKE2.0 or how to get involved with our online community, please visit www.openmethodology.org.

Sincerely,

MIKE2.0 Community  

Contribute to MIKE: Start a new article, help with articles under construction, or look for other ways to contribute.

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


Did You Know?
All content on MIKE2.0 and any contributions you make are published under the Creative Commons license. This allows you free re-use of our content as long as you add a brief reference back to us.

 

This Week’s Blogs for Thought:

Key Considerations for a Big Data Strategy

Most companies by now understand the inherent value found in big data. With more information at their fingertips, they can make better decisions regarding their businesses. That’s what makes the collection and analysis of big data so important today. Any company that doesn’t see the advantages that big data brings may quickly find itself falling behind its competitors.

Read more.

The Internet was a mistake. Now let’s fix it. 

Each generation over the last century has seen new technologies become so embedded in daily life that their absence would be unimaginable. Early in the 20th century it was radio, which quickly became the entertainment of choice, then television, video and, over the past two decades, the Internet. For the generation who straddles the implementation of each, there have been format and governance debates which are quickly forgotten. Today, few remember the colour television format choice every country made between NTSC and PAL, just as few remember that anyone who bought a video recorder in the early 1980s had to choose between VHS and Beta.

Read more.

Flash Quietly Taking Over Disk in a Big Data World

Right now, we live in the big data era. What was once looked at as a future trend is now very much our present reality. Businesses and organizations of all shapes and sizes have embraced big data as a way to improve their operations and find solutions to longstanding problems. It’s almost impossible to overstate just how much big data has impacted the world in such a short amount of time, affecting everyone’s lives whether or not we truly comprehend how.

Read more.

Forward to a Friend! Know someone who might be interested in joining the MIKE2.0 Community? Forward this message to a friend.

Questions?

If you have any questions, please email us at mike2@openmethodology.org

 

 

Category: Information Development
No Comments »

by: Robert.hillard
20  Jun  2015

The Internet was a mistake, now let’s fix it

Each generation over the last century has seen new technologies become so embedded in daily life that their absence would be unimaginable. Early in the 20th century it was radio, which quickly became the entertainment of choice, then television, video and, over the past two decades, the Internet.

For the generation who straddles the implementation of each, there have been format and governance debates which are quickly forgotten. Today, few remember the colour television format choice every country made between NTSC and PAL, just as few remember that anyone who bought a video recorder in the early 1980s had to choose between VHS and Beta.

It is ironic that arguably the biggest of these technologies, the Internet, has been the subject of the least debate on the approach to governance, standards and implementation technology.

Just imagine a world where the Internet hadn’t evolved in the way it did. Arguably the connectivity that underpins the Internet was inevitable. However, the decision to arbitrarily open up an academic network to commercial applications undermined well-progressed private sector offerings such as AOL and Microsoft’s MSN.

That decision changed everything and I think it was a mistake.

While the private sector offerings were fragmented, they were well governed and had responsible owners.

Early proponents of the Internet dreamed of a virtual world free of any government constraints.  Perhaps they were influenced by the end of the Cold War.  Perhaps they were idealists.  Either way, the dream of a virtual utopia has turned into an online nightmare which every parent knows isn’t safe for their children.

Free or unregulated?

The perception that the Internet is somehow free, in a way that traditional communications and sources of information are not, is misguided.

Librarians have long had extraordinary codes of conduct to protect the identity of borrowers from government eyes.  Compare that to the obligation in many countries to track metadata and the access that police, security agencies and courts have to the online search history of suspects.

Telephone networks have always been open to tapping, but the closed nature of the architecture meant that those points are governed and largely under the supervision of governments and courts. Compare that to the Internet, which does theoretically allow individuals to communicate confidentially with little chance of interception, but only if you are one of the privileged few with adequate technical skill. The majority of people, though, have to just assume that every communication, whether voice, text or video, is open to interception.

Time for regulation

We need government in the real world and we should look for it on the Internet.

The fact that it is dangerous to connect devices directly to the Internet without firewalls and virus protection is a failure of every one of us who is involved in the technology profession.  The impact of the unregulated Internet on our children and the most vulnerable in our society reflects poorly on our whole generation.

It is time for the Internet to be properly regulated.  There is just too much risk, and (poor) regulation is being put in place by stealth anyway.  Proper regulation and security would add a layer of protection for all users.  It wouldn’t remove all risk; even the humble telephone has long been used as a vehicle for scams, but remedies have been easier to achieve and law enforcement more structured.

The ideal of the Internet as a vehicle of free expression need not be lost and in fact can be enhanced by ethically motivated governance with the principle of free speech at its core.

Net neutrality is a myth

Increasing the argument for regulation is the reality of the technology behind the Internet.  Most users assume the Internet is a genuinely flat virtual universe where everyone is equal.  In reality, the technology of the Internet is nowhere near the hyperbole.  Net neutrality is a myth and we are still very dependent on what the Internet Service Providers (ISPs) or telecommunications companies do from an architecture perspective (see The architecture after cloud).

Because the Internet is not neutral, there are winners and losers just as there are in the real world.  The lack of regulation means that they come up with their own deals and it is simply too complicated for consumers to work out what it all means for them.

Regulation can solve the big issues

The absence of real government regulation of the Internet is resulting in a “Wild West” and an almost vigilante response.  There is every probability that current encryption techniques will be cracked in years to come, making it dangerous to transmit information that could be embarrassing in the future.  This is leading to investment in approaches such as quantum cryptography.

In fact, with government regulation and support, mathematically secure communication is eminently possible.  Crypto theory says that a message encrypted with a truly random key as long as the message itself cannot be broken without a copy of the key.  Imagine a world where telecommunication providers, working under appropriate regulations, issued physical media similar to passports containing sufficient random digital keys to transmit all of the sensitive information a household would share in a year or even a decade.
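
To make that cryptographic claim concrete, here is a minimal Python sketch of the textbook one-time pad; it illustrates the principle only, not the regulated key-distribution scheme imagined above.

    # One-time pad illustration: a message XOR-ed with a truly random key of the
    # same length cannot be recovered without a copy of that key (provided the
    # key is never reused). Textbook idea only, not a production scheme.
    import secrets

    def otp_encrypt(message: bytes, key: bytes) -> bytes:
        """XOR each message byte with the corresponding key byte."""
        assert len(key) >= len(message), "the pad must be at least as long as the message"
        return bytes(m ^ k for m, k in zip(message, key))

    def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
        """XOR is its own inverse, so decryption is the same operation."""
        return otp_encrypt(ciphertext, key)

    message = b"meet at noon"
    key = secrets.token_bytes(len(message))  # truly random, used once only
    ciphertext = otp_encrypt(message, key)
    assert otp_decrypt(ciphertext, key) == message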

We would effectively be returning to the model of traditional phone services where telecommunication companies managed the confidentiality of the transmission and government agencies could tap the conversations with appropriate (and properly regulated) court supervision.

Similarly, we would be mirroring the existing television and film model of rating all content on the Internet allowing us to choose what we want to bring into our homes and offices.  Volume is no challenge with an army of volunteers out there to help regulators.

Any jurisdiction can start

Proper regulation of the Internet does not need to wait for international consensus.  Any one country can kick things off with almost immediate benefit.  As soon as sufficient content is brought into line, residents of that country will show more trust towards local providers, which will naturally keep a larger share of commerce within their domestic economy.

If it is a moderately large economy then the lure of frictionless access to these consumers will encourage international content providers to also fall into line given the cost of compliance is likely to be negligible.  As soon as that happens, international consumers will see the advantage of using this country’s standards as a proxy for trust.

Very quickly it is also likely that formal regulation in one country will be leveraged by governments in others.  The first mover might even create a home-grown industry of regulation as well as supporting processes and technology for export!

Category: Information Governance
No Comments »

by: Jonathan
19  Jun  2015

Key Considerations for a Big Data Strategy

Most companies by now understand the inherent value found in big data. With more information at their fingertips, they can make better decisions regarding their businesses. That’s what makes the collection and analysis of big data so important today. Any company that doesn’t see the advantages that big data brings may quickly find itself falling behind its competitors. To benefit even more from big data, many companies are employing big data strategies. They see that it is not enough to simply have the data at hand; it must be utilized in the most effective manner to maximize its potential. Coming up with the best big data strategy, however, can be difficult, especially since every organization has different needs, goals, and resources. When creating a big data strategy, it’s important for companies to consider several main issues that can greatly affect its implementation.

When first developing a big data strategy, businesses will need to look at the current company culture and change it if necessary. This essentially means encouraging employees throughout the whole organization to get into the spirit of embracing big data. That includes people on the business side of things along with those in the IT department. Big data can change the way things are done, and those who are resistant to those changes could be holding the company back. For that reason, they should be encouraged to be more open about the effect of big data and ready to accept any changes that come about. Organizations should also encourage their employees to be creative with their big data solutions, basically fostering an atmosphere of experimentation while being willing to take more risks.

As valuable as big data can be, simply collecting it for the sake of collecting it will often result in failure. Every big data strategy needs to account for specific business objectives and goals. By identifying precisely what they want to do with their data, companies can enact a strategy that drives toward that single objective. This makes the strategy more effective, allowing organizations to avoid wasting money and resources on efforts that won’t benefit the company. Knowing the business objectives of a big data strategy also helps companies identify what data sources to focus on and what sources to steer clear of.

It’s the value that big data brings to an organization that makes it so crucial to properly use it. When creating a big data strategy, businesses need to make sure they view big data as a company-wide asset, one which everyone can use and take advantage of. Too often big data is seen as something meant solely for the IT department, but it can, in fact, benefit the organization as a whole. Big data shouldn’t be exclusive to only one group within a company. On the contrary, the more departments and groups can use it, the more valuable it becomes. That’s why big data strategies need a bigger vision for how data can be used, looking ahead to the long-term and avoiding narrowly-defined plans. This allows companies to dedicate more money and resources toward using big data, which helps them to innovate and use it to create new opportunities.

Another point all organizations need to consider is the kind of talent present in their companies. Data scientists are sought by businesses the world over because they can provide a significant boost to accomplishing established big data business goals. Data scientists are different from data analysts since they can actually build new data models, whereas analysts can only use models that have been pre-made. As part of a big data strategy, the roles and responsibilities of data scientists need to be properly defined, giving them the opportunity to help the organization achieve the stated business objectives. Finding a good data scientist with skills involving big data platforms and ad hoc analysis that are appropriate for the industry can be difficult with demand so high, but the value they can add is well worth it.

An organized and thoughtful big data strategy can often mean the difference between successful use of big data and a lot of wasted time, effort, and resources. Companies have a number of key considerations to account for when crafting their own strategies, but with the right mindset, they’ll know they have the right plans in place. Only then can they truly gain value from big data and propel their businesses forward.

Category: Business Intelligence
No Comments »

by: RickDelgado
18  Jun  2015

Flash Quietly Taking Over Disk in a Big Data World

Right now, we live in the big data era. What was once looked at as a future trend is now very much our present reality. Businesses and organizations of all shapes and sizes have embraced big data as a way to improve their operations and find solutions to longstanding problems. It’s almost impossible to overstate just how much big data has impacted the world in such a short amount of time, affecting everyone’s lives whether or not we truly comprehend how. That means we live in a world awash in data, and as companies pursue their own big data strategies, they’ve had to rethink how to store all that information. Traditional techniques have proven unable to handle the huge amount of data being generated and collected on a daily basis. What once was dominated by hard disk drives (HDD) is now rapidly changing into a world driven by solid-state drives (SSD), otherwise known as flash storage.

For years, when talking of big data analytics, the assumption was that a business was using disk. There were several reasons for this, the main one being cost. Hard disk drives were simply cheaper, and for the most part they could deal with the increasing workloads placed upon them. The more data measured and generated, however, the more the limitations of HDD were unmasked. This new big data world needed a storage system capable of handling the workload, and thus the migration to flash storage began.

Many, including Gartner, peg 2013 as the year the switch really gained steam. Solid-state arrays had already been a storage strategy up until then, but in 2013 flash storage manufacturers began constructing arrays with new features like thin provisioning, deduplication, and compression. Suddenly, the benefits gained from using flash storage outweighed some of the drawbacks, most notably the higher cost. In a single year, solid-state arrays saw a surge in sales, increasing by more than 180 percent from 2012. With the arrival of flash storage to the mainstream, organizations could begin to replace their hard disk drives with a system more capable of processing big data.

And that’s really a main reason why flash storage has caught on so quickly. SSDs provide much higher performance than traditional storage options. Of particular note is the reduction in the time it takes to process data. Just one example of this is the experience of the Coca-Cola Bottling Co., which began collecting big data but was soon met by long delays in production due to having to sort through loads of new information. When the company adopted flash storage solutions, the amount of time needed to process data was cut dramatically. For example, processing jobs that had taken 45 minutes now took only six. These kinds of results aren’t unique, which is why so many other businesses are seeking flash storage as their primary means of storing big data.

Many tech companies are responding to this increased demand by offering up more options in flash storage. SanDisk has recently unveiled new flash systems specifically intended to help organizations with their efforts in big data analytics. The new offerings are meant to be an alternative to the tiered storage often seen in data centers. Other major tech companies, such as Dell, Intel, and IBM, have shown similar support for flash storage, indicating the lucrative nature of offering flash solutions. The growth isn’t just being driven by private companies either; educational institutions have found a need for flash storage as well. MIT researchers announced last year that they would be switching from hard disks to flash storage in order to handle the demands of big data more effectively. The researchers determined that hard disk drives were too slow, so a better performing storage solution was needed.

As can be seen, flash storage has been quietly but surely taking over hard disk’s turf. That doesn’t mean hard disk drives will soon be gone for good. HDD will likely still be used for offline storage — mainly archiving purposes for data that doesn’t need to be accessed regularly. But it’s clear we’re moving into a world where solid-state drives are the most prevalent form of storage. The need to collect and process big data is making that happen, providing new, unique opportunities for all kinds of organizations out there.

Category: Enterprise Data Management
No Comments »

by: Robert.hillard
29  May  2015

Who are we leaving behind?

I was recently invited to deliver the keynote address at the University of Melbourne engineering and IT awards night.  I took the opportunity to challenge today’s students to think about the people being left behind in the move to a digital economy.

Some 25 years ago I had the privilege of attending this university.  In thinking about tonight and the achievements of so many of you, I couldn’t help reflecting on the challenges my generation faced entering professional life in the 1990s and comparing them to what you will see in the decades ahead.

Wherever you turn in the media you are hearing the term “digital disruption”.  For those of us lucky enough to be educated in the so-called STEM disciplines of Science, Technology, Engineering and Mathematics, we probably feel empowered and excited by the talk of “digital disruption”.  How could we not look forward to using the technology of our time to replace the cars we drive with electric vehicles, our plastic credit cards with mobile wallets or the work cubicle of my generation with working flexibly at a local café?

We must not forget that the same sense of optimism is not necessarily true for societies around the world.  At a macro-level, where there is technological disruption, business disruption follows.  But where there is business disruption, social disruption is all too often the result.

Already email has all-but killed traditional mail.  Mainstream bookstores are a shadow of their former selves.  I can’t remember the last time I was in a record or CD store.  Hardly any businesses worry about advertising in the telephone directory.  Each of these changes has cost jobs and not everyone who lost out has found a role in the changing economy.

We have a responsibility to make sure that change is not only good for those of us lucky enough to have access to the right skills, but also to make change good for society as a whole.  Future generations will judge us by how we navigate the next two decades.

Change leaves winners and losers

We should be under no illusion: the changes we are going to go through in transport, energy and, particularly, financial services will leave far more people at a disadvantage than any change that we have seen so far in the era of the Internet.

Does it matter that the next wave of innovation could see the end of local media content rules?  Does it matter that Department stores could be wiped out in Australia?  Does it matter that banks could be taken to the brink by a wave of fintech innovators, including peer-to-peer lending?

Take the last of these; replacing banks sounds an awful lot like risking the failure of the banks.  Whether it was in the nineteenth century, twentieth century or most recently in the GFC in this century, whenever banks have gone down many ordinary people have been badly hurt.

Disruption spurred on by digital technology is extending into mainstream engineering.  For example, batteries are finally poised to fill the gap in solar energy, providing stored base-load power for when the sun isn’t shining, and electric vehicles hardly need servicing.  We could be just a few years away from houses going off-grid en masse.  These changes will leave car dealerships without a source of service income and power utilities without a market.

The concentration of wealth

The birth of the commercial Internet offered an opportunity for small business to compete with large companies with many of the advantages of scale and geography being removed.  At the turn of the century we saw an opportunity for a utopia of innovation spread across the globe with the rewards reaching far more people than ever before.

Twenty years in, the reality is not entirely aligned with this vision.  The network effect means that there are advantages to scale.  The more people that use the same search engine, the better its algorithms.  The more people that use the same music service, the better its catalogue.  The more people that use the same social network, the greater its reach.

However, the evidence is that the groundswell of innovation is turning the tide, with new money pouring into start-ups regardless of location and value being added in more locations than ever before.  That is where this room can lift its sights.  Seek to add value to the local economy where you and your family want to live and where you can see a society that you want to build.  Don’t be afraid to resist the pull to traditional centres of innovation. Don’t just think of your personal wealth but also of where your effort will contribute to your community.

Using technology as part of the solution

We in this room have the ability to channel our knowledge, skills and innovative flair to not only develop new applications of technology but also to corral and encourage the application of technology in such a way as to minimise unintended consequences and potentially achieve new benefits for our society.

We can choose to enable a sharing economy, created through yet more innovations such as 99designs, Kaggle, Freelancer, Airbnb, DriveMyCar, and so many more, which provide income to a wider range of people using their available talent and resources.

We can choose to support local talent through the development of great ideas and then seeing them through to commercial success.  We can use automation and 3D printing to enable local manufacturing.  We can develop specialised services which are ready to be purchased by government and industry so that they are less dependent on imports.

I refuse to believe that we can’t use technology to improve access to capital while maintaining a safe financial system.  I am convinced that we can find better ways to access products and services without doing away with storefronts.  I know that we can make the move from fossil fuels to renewables while keeping a highly skilled engineering capability locally.

In conclusion

You will face different professional challenges to those that I faced as I approached the end of my education at this university.  You will be the shapers of society in the decades ahead.  You will help decide whether to throw ourselves headlong into technology-driven disruption or whether to keep a watch out for those who are left behind. You will decide whether you will be drawn into ever greater geographic concentration of innovation or if you will take the path to keeping value in the community you want to be part of.

I hope you will seek to make the right choices with the education you have worked so hard to earn.

Thank you.

Category: Enterprise2.0
No Comments »

by: Bsomich
18  May  2015

MIKE2.0 Community Update


 

 

Missed what’s been happening in the MIKE2.0 data management community? Check out our latest update:


How Do You Define Your Master Data? 

There are numerous definitions for “master data” ranging from one sentence to a few paragraphs.  This is perhaps the most straightforward one I’ve come across:

Master data is the core data that is essential to operations in a specific business or business unit. - via Whatis.com

A clear and simple definition, yet many companies struggle to adhere to it when identifying and qualifying master data for their organizations.

Why do you think this is?

Although data is often looked at on a transactional basis, master data typically makes up a large percentage of the data elements in any given transaction. Common examples of master data include:

  • Customer data (name, contact details, DOB, customer classification)
  • Locality data (physical address, postal address, geographical data)
  • Product data (item number, bill of materials, product codes)
  • Employee data (employee number, role, placement in organisational structure)
  • Partner data (partner name, classification)

It is not unusual for this same data to be held in dozens or even hundreds of applications across a large organization, and it may be difficult to isolate and collect.  Much of the data has been held in legacy systems for years, often poorly integrated and at low levels of quality.  Many organizations have poorly implemented Data Governance processes to handle changes in this data over time.

MIKE2.0 offers an open source solution for managing master data that outlines many of the issues organizations have with identifying it.

We hope you find this offering of benefit and welcome any suggestions you may have to improve it.

Sincerely,

MIKE2.0 Community

Popular Content

Did you know that the following wiki articles are most popular on Google? Check them out, and feel free to edit or expand them!

What is MIKE2.0?
Deliverable Templates
The 5 Phases of MIKE2.0
Overall Task List
Business Assessment Blueprint
SAFE Architecture
Information Governance Solution

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


 

This Week’s Blogs for Thought:

5 Unusual Ways Businesses are Using Big Data

Big data is where it’s at. At least, that’s what we’ve been told. So it should come as no surprise that businesses are busy imagining ways they can take advantage of big data analytics to grow their companies. Many of these uses are fairly well documented, like improving marketing efforts, or gaining a better understanding of their customers, or even figuring out better ways to detect and prevent fraud. The most common big data use cases have become an important part of industries the world over, but big data can be used for much more than that. In fact, many companies out there have come up with creative and unusual uses for big data analytics, showing just how versatile and helpful big data can be.

Read more.

Cloud Computing and the Industries that Love It

Cloud computing provides greater security, virtually unlimited computing resources for research and development, cost savings, and advanced threat detection methods. With so many reasons to use cloud computing, it’s no wonder many industries have flocked to the new technology. Cloud technology serves as a form of outsourcing for companies, where some data is kept in house for better control, and other data is trusted to a third-party provider. Each industry that benefits from cloud computing has its own specific reasons for adopting the technology, but cloud computing is most profitable for companies that work with emerging markets and need quick and cost-effective scalability.

Read more.
Is Your Data Quality Boring? 

Let’s be honest here. Data Quality is good and worthy, but it can be a pretty dull affair at times. Information Management is something that “just happens”, and folks would rather not know the ins-and-outs of how the monthly Management Pack gets created. Yet I’ll bet that they’ll be right on your case when the numbers are “wrong.” Right?

So here’s an idea. The next time you want to engage someone in a discussion about data quality, don’t start by discussing data quality. Don’t mention the processes of profiling, validating or cleansing data. Don’t talk about integration, storage or reporting. And don’t even think about metadata, lineage or auditability.

Read more.

Forward this message to a friend

Questions?

If you have any questions, please email us at mike2@openmethodology.org. 

 

Category: Information Development
No Comments »

by: Bsomich
12  May  2015

Defining Master Data

There are numerous definitions for “master data” ranging from one sentence to a few paragraphs.  This is perhaps the most straightforward one I’ve come across:

Master data is the core data that is essential to operations in a specific business or business unit. - via Whatis.com

A clear and simple definition, yet many companies struggle to adhere to it when identifying and qualifying master data for their organizations.

Why do you think this is?

Although data is often looked at on a transactional basis, master data typically makes up a large percentage of the data elements in any given transaction. Common examples of master data include:

  • Customer data (name, contact details, DOB, customer classification)
  • Locality data (physical address, postal address, geographical data)
  • Product data (item number, bill of materials, product codes)
  • Employee data (employee number, role, placement in organisational structure)
  • Partner data (partner name, classification)

It is not unusual for this same data to be held in dozens or even hundreds of applications across a large organization, and it may be difficult to isolate and collect.  Much of the data has been held in legacy systems for years, often poorly integrated and at low levels of quality.  Many organizations have poorly implemented Data Governance processes to handle changes in this data over time.
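
As a simple illustration of the consolidation problem described above, the Python sketch below shows the same customer held slightly differently in two applications and matched on a crude key; the field names and source systems are hypothetical, and real master data management tools use far more sophisticated fuzzy matching and survivorship rules.

    # Hypothetical illustration: the same customer held in two applications,
    # matched on a crude key. Field names and systems are invented for the example.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class CustomerRecord:
        source_system: str
        name: str
        date_of_birth: str  # ISO format for simplicity
        postal_address: str

    def match_key(record: CustomerRecord) -> tuple:
        """Crude matching key on normalised name and date of birth."""
        return (record.name.strip().lower(), record.date_of_birth)

    crm = CustomerRecord("CRM", "Jane Citizen", "1980-04-12", "1 High St, Melbourne")
    billing = CustomerRecord("Billing", "JANE CITIZEN ", "1980-04-12", "1 High Street, Melbourne VIC")

    if match_key(crm) == match_key(billing):
        print("Likely the same master customer record held twice:")
        print(crm)
        print(billing)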

MIKE2.0 offers an open source solution for managing master data that outlines many of the issues organizations have with identifying it.

How do you define and qualify your master data?

Category: Information Development
No Comments »

by: Robert.hillard
25  Apr  2015

Experts make better decisions with an understudy

Most of us are experts at something.  An expert is someone who can reliably assess a situation and apply an appropriate advanced skill or technique.  Knowing what skill to apply and when is just as important as the technical capability that is applied.  Examples include medical specialists deciding whether to operate and, if so, how.

Knowing what skill to apply requires data.  For doctors, this is usually in the form of symptoms; for accountants, it’s the financial results; and for engineers, it’s the telemetry that is generated by almost all of the infrastructure that now surrounds us.

Understanding how we use data is really important.  Knowledge Management experts talk about tacit versus explicit knowledge.  The former is often hard to document or clearly communicate.  Yet tacit does not imply that it is not based on data; it often draws on a complex combination of the facts at hand and the experience of the practitioner.

Even the best knowledge systems can’t match the interpretation of the data that the tacit knowledge of experts can achieve.  Although Big Data analytics solutions are making good progress, they can’t make the sort of expert cognitive leaps that we rely on for some of our most critical decisions (see Your insight might save your job).  It’s going to be a while before our General Practitioner is replaced by a computer.

But how good are the decisions that experts make?  If the interpretation of the results is unambiguous then it is likely that an alert and capable expert will make the right decision.  Their choices can be validated by a second-in-charge, such as a co-pilot in an airplane cockpit, with consensus almost certain.  But these are the sorts of decisions that are most at risk of automation.  What about those decisions that are dealing with imperfect data, ambiguous symptoms or the convergence of apparently unrelated issues?

Teaching makes us better experts

When we teach we challenge ourselves.  Many years ago, I had my first opportunity to teach students in my own discipline of data management.  At that point in my career I was already considered an expert and I was very used to delivering expert advice to clients.

What changed when I had to teach was the need to provide evidence and references.  In doing so, I was forced to critically examine my decision making process.  While my overall approach didn’t change, I found myself being more formal in the way I referenced my client work and I tried to not only satisfy my clients but also consider what my students would ask.

Open talent

There is a lot of talk around open talent models, with the likely result that organisations can access the global expert who can answer their specific question.  This is happening across the board, including disciplines such as management consulting, engineering, accounting, law and even medicine.

For many tasks this makes perfect sense.  An expert who can review the data and provide a specific answer, recommendation or diagnosis is incredibly valuable.  With social networks, finding such an expert is sometimes only a few clicks away even for the most obscure but specific facts to be reviewed.

I would argue, however, that if the expert is simply providing an answer to a specific question, then ultimately the expert’s role will be automated in the future.  Not only do these sorts of experts face redundancy through automation but even when they are using their skills to provide insight they are operating in a vacuum.  Their ideas go largely unchallenged and are not developed further.

The value of mentoring

Compare that situation to a practitioner who is working with a younger group who they are mentoring or teaching.  The questions they will be asked force them to evaluate their whole approach and, on occasion, change their view.

This is the reason why teaching and research go hand-in-hand.  It isn’t only the labour capacity that students and junior staff provide, it is also the perspective that they either bring to the table or that they trigger in their supervisors.

In my own field of Management Consulting, this is the most important function of graduates and junior staff.  They offer a refreshing perspective.  They assume that there are no dumb questions and are eager to learn.  In their eagerness, they don’t hesitate to question established orthodox perspectives.

This is the reason I am an optimist about the future for so many of our professions.  Despite the threat of automation and the enthusiasm for offshoring to a few experts, the really good decisions are usually made by experts who are surrounded by teams who are eager to learn from them.  There will be a role for this staffing model for many years to come.

Category: Enterprise2.0
No Comments »

by: Jonathan
17  Apr  2015

5 Unusual Ways Businesses are Using Big Data

Big data is where it’s at. At least, that’s what we’ve been told. So it should come as no surprise that businesses are busy imagining ways they can take advantage of big data analytics to grow their companies. Many of these uses are fairly well documented, like improving marketing efforts, or gaining a better understanding of their customers, or even figuring out better ways to detect and prevent fraud. The most common big data use cases have become an important part of industries the world over, but big data can be used for much more than that. In fact, many companies out there have come up with creative and unusual uses for big data analytics, showing just how versatile and helpful big data can be.

1. Parking Lot Analytics

Every business is trying to gauge how well they are doing, and big data is an important part of that. Perhaps some study the data that comes from their websites, or others look at how effective their marketing campaigns are. But can businesses measure their success by studying their parking lots? One startup is doing that very thing. Using satellite imagery and machine learning techniques, Orbital Insight is working with dozens of retail chains to analyze parking lots. From this data, the startup says it can assess the performance of each company without needing further information. Their algorithm uses deep learning to delve into the numbers and find unique insights.

2. Dating Driven By Data

Big data is changing the way people date. Many dating websites, like eHarmony, use the data they compile on their users to come up with better matches, increasing the odds they’ll find someone they’re compatible with. With open source tools like Hadoop, dating sites can gain detailed data on users through answers to personal questions as well as through behaviors and actions taken on the site. As dating sites collect more data on their customers, they’ll be able to more accurately predict who matches well with whom.

3. Data at the Australian Open

Many sports have adopted big data to get a better understanding of their respective games, but big data is also being used in a business sense in the sports world. The Australian Open relies heavily on big data during the tournament in response to the demands of tennis fans around the world. With big data, they can optimize tournament schedules and analyze information like social media conversations and player popularity. From there, the data is used to predict viewing demands on the tournament’s website, helping organizers determine how much computing power they need at any given time.

4. Dynamic Ticket Pricing

The NFL is also using big data analytics to boost their business. While it might seem like the NFL doesn’t need help in this regard, they still want to use big data to increase ticket sales. The goal is to institute variable ticket pricing, which has already been implemented by some teams. Using big data, NFL teams can determine the level of demand for specific games based on factors like where it falls in the season, who the opponent is, and how well the home team is playing. If it’s determined demand is high, ticket prices will go up. If demand is predicted to be low, prices will go down, hopefully increasing sales. With dynamic ticket pricing, fans wouldn’t have to pay high prices for games that are in low demand, creating more interest in the product, especially if a team is struggling.
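
A minimal sketch of how such variable pricing might work is below; the factors follow the paragraph above, but the weights, floor and ceiling are invented for illustration and are not any team's actual model.

    # Toy demand-based pricing: the factors mirror those mentioned above
    # (point in the season, opponent draw, home-team form); weights are invented.
    def demand_score(week_of_season: int, opponent_draw: float, home_win_rate: float) -> float:
        late_season = week_of_season / 17  # later games assumed to draw more interest
        return 0.3 * late_season + 0.4 * opponent_draw + 0.3 * home_win_rate

    def dynamic_price(base_price: float, score: float, floor: float = 0.7, ceiling: float = 1.5) -> float:
        """Scale the base price with demand, within a floor and a ceiling."""
        multiplier = max(floor, min(ceiling, 0.7 + score))
        return round(base_price * multiplier, 2)

    # A late-season game against a popular opponent with a winning home team
    print(dynamic_price(100.0, demand_score(week_of_season=15, opponent_draw=0.9, home_win_rate=0.8)))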

5. Ski Resorts and Big Data

Many ski resorts are truly embracing the possibilities of big data. This is done through basic ideas, like saving rental information, but it can also be used to prevent ticket fraud, which can take out a good chunk of revenue. Most impressive is how big data is used to increase customer engagement through the use of gamification. With Radio Frequency Identification (RFID) systems, resorts can actually track skiers, compiling stats like number of runs made, number of feet skied, and how often they get to the slopes. This data can be accessed on a resort’s website where skiers can compete with their friends, earning better rankings and rewards which encourage them to spend more time on the slopes.
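
As a rough sketch of how those stats might be compiled, the example below aggregates hypothetical RFID lift-gate scans into per-skier totals; the event fields and values are invented for illustration.

    # Hypothetical RFID lift-gate scans aggregated into per-skier stats
    # (run counts and vertical feet) of the kind described above.
    from collections import defaultdict

    scans = [
        {"pass_id": "A123", "lift": "Summit Express", "vertical_feet": 1600},
        {"pass_id": "A123", "lift": "Ridge Quad", "vertical_feet": 900},
        {"pass_id": "B456", "lift": "Summit Express", "vertical_feet": 1600},
    ]

    stats = defaultdict(lambda: {"runs": 0, "vertical_feet": 0})
    for scan in scans:
        stats[scan["pass_id"]]["runs"] += 1
        stats[scan["pass_id"]]["vertical_feet"] += scan["vertical_feet"]

    for pass_id, totals in stats.items():
        print(pass_id, totals)  # feeds leaderboards and rewards on the resort's website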

These cases show that with a bit of creative thinking, big data can help businesses in more ways than one. As companies become more familiar working with big data, it’s easy to see how unique and innovative solutions will likely become the norm. As unusual as some of these uses may be, they may represent only the beginning of many unique ventures in the future.

 

Category: Business Intelligence
No Comments »

by: RickDelgado
15  Apr  2015

Cloud Computing and the Industries that Love It

Cloud computing provides greater security, virtually unlimited computing resources for research and development, cost savings, and advanced threat detection methods. With so many reasons to use cloud computing, it’s no wonder many industries have flocked to the new technology. Cloud technology serves as a form of outsourcing for companies, where some data is kept in house for better control, and other data is trusted to a third-party provider. Each industry that benefits from cloud computing has its own specific reasons for adopting the technology, but cloud computing is most profitable for companies that work with emerging markets and need quick and cost-effective scalability.

The Chemicals Industry

Chemical companies are being driven to improve their flexibility, reduce costs, improve speed and become more responsive. Cloud computing provides this by transforming chemical businesses into thriving, cloud-based digital businesses. Chemical companies must be prepared to penetrate new markets quickly. Higher speed and greater visibility are also continually evolving needs in this industry as collaboration becomes more important than ever.

As governments continue to push for green legislation, chemical companies often find themselves in the crosshairs of local and federal regulators. Switching to cloud-based providers offers one way to increase accountability and reduce resource consumption. Additionally, as cost efficiency becomes increasingly important, chemical companies love the fact that cloud computing provides greater operational agility and increased cost savings across the entire industry.

Chemical companies use IaaS and SaaS in an effort to control costs and use virtualization to create private cloud architectures for their businesses. For example, Dow Chemical met the requirements for 17 European countries by moving its operations to the cloud.

Law Firms

Law firms deal with large amounts of data on a regular basis, and they need to ensure that the data stays safe. In the past, law firms have mainly relied upon in-house servers to manage their operations. As the expense for maintaining the computers, servers, software and hiring IT administrators has grown, cloud computing has become an attractive alternative.

Even the simplest building closure can put a law firm out of reach of its data. This can seriously hamper an attorney’s ability to effectively manage clients and maintain a high level of service. By moving data to the cloud, law firms have become more prepared to deal with disasters and can reduce the possibility of being without crucial data. Lawyers must also increasingly work outside the office to meet with clients and maintain a high level of effectiveness. Cloud computing solves the problem by making secure access to content away from the office a much simpler and safer endeavor.

There are unique ethical considerations that any law firm must weigh when entrusting its data to a third party. Law firms can maintain control of their data while still utilizing cloud servers for advanced threat defense, security and applications that are non-sensitive. While law firms must perform due diligence and talk specifically with their cloud provider about data center locations, how data is treated, encryption levels, and their duties in the event of a subpoena, the move to the cloud offers a chance for greater efficiency, reliability and cost savings.

Startup Communication Companies

The trend with new startups is to get going quickly and define the work at a later date. Many startups don’t have a clear mission plan and rely upon data received from initial product launches to determine the direction a company will take. Startups are in a unique position to be extremely flexible and adaptable. Established companies generally have preferred software applications, complex networking arrangements and a system that requires careful planning before any changes are made to the infrastructure. With a startup, the entire structure of the company can be taken down and rebuilt in a single business day; this makes working in the cloud a dream for new companies.

No industry knows the importance of flexibility more than the communications industry. With new technologies being developed and emerging daily, it’s become increasingly important to have a dynamic and scalable workspace for research and development. The cloud provides an ideal environment for companies that may need terabytes of data one day and a few gigabytes the next. It also allows resources to be effectively managed without having to upgrade hardware, invest in costly data centers and hire several IT administrators to keep things running smoothly.

Government Agencies and Law Enforcement

Government agencies, including the CIA, FBI and local law enforcement, are continually evaluating cloud architecture to determine ways it can be utilized to increase efficiency, manage multiple departments and improve mobility. Governments largely deem cloud computing a safe alternative to traditional in-house servers as it provides advanced threat detection and a high degree of security.

Cross-agency cooperation is essential for governments that need to share information on a state and federal level. By keeping information available in the cloud, state agencies can work more effectively with federal authorities. This makes it possible to share information quickly, and improve the ability to stop an advanced threat before it causes harm. Governments can use public cloud services for less critical information, and a private cloud service for the most sensitive data. This provides the best of both worlds.

The Future of Cloud Computing

Any industry that needs a highly secure, adaptable and scalable computing environment can benefit from cloud computing. From the music industry to the local startup that is still defining its purpose, cloud computing can reduce costs, improve efficiency and increase security for any company. As governments continue to impose strict fines and penalties for failing to maintain good security practices, it has become more important than ever to safeguard company and customer information. The cloud does this at a low cost and with great flexibility.

 

Category: Enterprise Data Management
No Comments »
