
Archive for the ‘Enterprise Data Management’ Category

by: Robert.hillard
23 Dec 2016

Who do you love?

An alien relying on TV for their knowledge of humanity might watch a few ads and assume our closest emotional relationships are with banks, utilities and retailers. After all, they all claim to be your best friend; just look at how many ads talk about “falling in love” with your service provider!

It is popular to talk about the relationship between customers and the businesses that serve them. Banks, airlines and utilities all seek to be best friends with their customers. This is probably understandable given that most of us are passionate about the businesses we work for and we want our customers to be as well.

In building such a relationship, marketers can point to great examples such as airline loyalty schemes, social media and even the account balance page of internet banking sites. In each case, there are individuals who interact daily, even hourly, with these services and look forward to each touchpoint.

Such a strong relationship is hard for most businesses to maintain with the majority of their customers. After all, most people don’t get excited looking up their electricity prices, mortgage rates or the phone numbers they’ve recently called.

The common attribute of the businesses we care about seems to be the information they provide. Many people can’t imagine why they would care deeply about a bank, yet a small number of people check their bank account balances multiple times in a day. Anecdotally, those repeat checkers are dreaming of a savings goal, which provides a halo effect for the bank.

Similarly, many travellers love to track their frequent flyer status which they see as a reward in its own right. The airlines create portals that engage their premium passengers and offer a regular sense of progress and engagement.

Uber has a fascinating screen on its app showing all the cars circling locally while eBay has nailed the search for a bargain. Some fintechs are attracting customers by creating a “fiddle factor”, letting them earn small rewards in different ways.

At the same time, it doesn’t seem that people care too much whether they love their basic services. Most people just want their savings to be safe, their lights to stay on and their phones to ring. The only problem is that in an environment where they can change providers easily, this lack of loyalty means that they are more likely to make a switch.

How can a brand that provides a capability people need, but inspires little passion, align with a brand that everyone cares about? This is the power of the API economy, where it is easy for businesses to partner seamlessly.
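To make the idea concrete, here is a minimal sketch of what such a partnership can look like at a technical level: one provider calling another’s API so that the two brands appear as a single experience. The endpoint, parameters and key below are purely illustrative assumptions, not any real bank’s or airline’s API.

```python
import requests

# Hypothetical partner endpoint: an airline exposing a loyalty-balance API
# that a bank's app could surface next to the customer's account balance.
PARTNER_API = "https://api.example-airline.com/v1/loyalty/balance"  # illustrative URL

def fetch_partner_balance(member_id: str, api_key: str) -> dict:
    """Fetch a customer's frequent-flyer balance from a partner service."""
    response = requests.get(
        PARTNER_API,
        params={"member_id": member_id},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"member_id": "...", "points": 84210, "tier": "Gold"}

if __name__ == "__main__":
    print(fetch_partner_balance("FF123456", api_key="demo-key"))
```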

Banks and airlines were pioneers in partnering, bringing together credit cards and air miles. Similarly, phone companies are partnering with music and movie streamers to dramatically increase engagement with their services. In coming years we can expect to see social media, fashion brands and travel businesses join with the everyday services that meet our basic needs.

To be successful, partners need to make sure they understand what elicits a strong affinity. To date, brands have largely taken the same approach for all customers. For example, “daily-deal” style retailers are highly attractive to some customers and highly annoying to others. Basic services, such as insurers, that choose to partner with businesses like these need to be very targeted, otherwise they risk alienating as many customers as they delight. Too many marketers have made this mistake and have potentially damaged their brands.

The key to a meaningful relationship is tailoring partnerships to offer customers something they genuinely want to engage with. Talking to the customer community and offering them choice is a very good start, and it gives the winners in the race to pair more opportunities to generate genuine friendship, if not love!

Category: Enterprise Data Management, Enterprise2.0, Web2.0

by: Robert.hillard
26 Nov 2016

Serendipity

Information overload is as much an overwhelming feeling as it is a measurable reality. We often feel an impossible obligation to be across everything, which leaves us wanting to give up and absorb nothing that hits our various screens. Despite all this, the good news is that the majority of the information we need seems to appear just in time.

Where does that leave those of us who are control freaks? I am not comfortable knowing that the right information will find me most of the time. I want to know that the information I need is guaranteed to find me every time!

The trouble is, guarantees are expensive. This is related to the debate between search-based big data solutions and enterprise data warehouses. Google provides a “near enough” search solution that, given the massive amount of data it trawls through, usually seems to find what we need. Knowledge and business intelligence solutions provide predictable information flows but come at a huge cost.

Of course, the real sense of serendipity comes when information arrives unsought just when we need it. It can come through the right article being highlighted in a social media feed, a corporate policy being forwarded or the right coffee conversation with a colleague. Yet serendipity isn’t random coincidence, and there is much we can do to improve the odds of it happening when we need it most.

Before doing so, it is important to know what things have to be predictable and reliable. A list is likely to include financial reports, approvals and other controls. What’s more, a scan of any email inbox is likely to show a significant number of messages that need to be read and often actioned. Despite its tyranny over our working lives, email works too well!

Serendipity depends on the quality of our networks, both in terms of who we know and the amount of activity that passes between the nodes. A good way to understand the power of relationships in an information or social network is through the theory of “small worlds” (see chapter 5 of my book Information-Driven Business).
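As an illustration of the small-worlds effect (a generic sketch, not the worked examples from the book), the snippet below uses the Watts-Strogatz model from the NetworkX library: rewiring even a small fraction of a tightly clustered network’s edges to random nodes collapses the average number of hops between any two people.

```python
import networkx as nx

# Watts-Strogatz "small world" model: start from a ring lattice (everyone knows
# only their nearest neighbours) and rewire a fraction p of edges to random nodes.
n, k = 200, 6  # 200 people, each initially connected to their 6 nearest neighbours

for p in (0.0, 0.01, 0.1):
    g = nx.watts_strogatz_graph(n, k, p, seed=42)
    hops = nx.average_shortest_path_length(g)
    clustering = nx.average_clustering(g)
    print(f"rewiring p={p:<4}  avg hops={hops:5.2f}  clustering={clustering:.2f}")

# Even a 1% chance of a long-range acquaintance collapses the average number of
# hops between any two people, which is why loosely connected contacts are often
# the ones who surface the information you need.
```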

Ironically, in an era when people talk about electronic isolation, social networks (that is, who we know) are more important than ever. Serendipity relies on people we know, at least vaguely, promoting content in a way that we are likely to see.

Just as control freaks worry about relying on serendipity, those who are more relaxed run the risk of relying too much on information finding its way mysteriously to them at the right time. Those who don’t understand why it works won’t understand when it won’t.

Far from making experts and consultants redundant, this increasing trend towards having the right information available when it’s needed is making them more necessary than ever before. The skill experts bring is more than information synthesis, something that artificial intelligence is increasingly good at doing and will become even better at in the near future. The job of experts is to find connections that don’t exist on paper, the cognitive leaps that artificial intelligence can’t achieve (see Your insight might just save your job).

The first thing is to be active in posting updates. Networks operate through quid pro quo: in the long term we get back as much as we give. In the office, we call this gossip. Too much gossip and it just becomes noise, but the right amount and you have an effective social network. Those people who only ever silently absorb information from their colleagues quickly become irrelevant to their social circle and gradually get excluded.

The second is to be constantly curious, like a bowerbird searching for and collecting shiny pieces of information, without necessarily knowing how they will all fit together. The great thing about our modern systems is that massive amounts of tagged content are easy to search in the weeks, months and years to come.

Finally, have some sort of framework or process for handling information exchange and picking a channel based on: criticality (in which case email is still likely to be the best medium), urgency (which favours various forms of messaging for brief exchanges), targeted broadcast (which favours posts explicitly highlighted/copied to individuals) or general information exchange (which favours general posts with curated social networks). Today, this is very much up to each individual to develop for themselves, but we can expect it to be part of the curriculum of future generations of children.
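As a sketch only, and using my own labels rather than any standard taxonomy, such a framework can be as simple as a short decision function:

```python
from dataclasses import dataclass

@dataclass
class Message:
    critical: bool = False   # a control: must be read and actioned
    urgent: bool = False     # needs a quick back-and-forth
    targeted: bool = False   # aimed at specific named individuals

def pick_channel(msg: Message) -> str:
    """Map a message's attributes to a channel, in the order given in the post."""
    if msg.critical:
        return "email"                        # still the best medium when it must be actioned
    if msg.urgent:
        return "instant message"              # brief exchanges
    if msg.targeted:
        return "post with explicit mentions"  # targeted broadcast
    return "general post to a curated network"  # general information exchange

print(pick_channel(Message(urgent=True)))  # -> instant message
print(pick_channel(Message()))             # -> general post to a curated network
```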

No matter how often it seems to happen, almost by magic, information serendipity is no accident and shouldn’t be left to chance.

Category: Enterprise Content Management, Enterprise Data Management, Enterprise Search, Enterprise2.0, Information Strategy, Information Value

by: Robert.hillard
29 Aug 2015

Behind the scenes

Before the advances of twentieth-century medicine, doctors were often deliberately opaque. They were well known for prescribing remedies that were little more than placebos. To encourage a patient’s confidence, much of what they wrote was intentionally unintelligible. As medicine has advanced, even as it has become more complicated, outcomes for patients have been enhanced by increasing their understanding.

In fact, the public love to understand the services and products that they use. Diners love it when restaurants make their kitchens open to view. Not only is it entertaining, it also provides confidence in what’s happening behind the scenes.

As buildings have become smarter and more complex, far from needing to hide the workings, architects have gone in the opposite direction with an increasing number of buildings making their technology a feature. It is popular, and practical, to leave structural supports, plumbing and vents all exposed.

This is a far cry from the world of the 1960s and 1970s when cladding companies tried to make cheap buildings look like they were made of brick or other expensive materials. Today we want more than packaging, we want the genuine article underneath. We want honest architecture, machinery and services that we can understand.

I find it fascinating that so many people choose to wear expensive watches that keep time through purely mechanical movements when the same function can be achieved by a great-looking ten-dollar digital watch. I think people are prepared to pay thousands when they believe in the elegance and function of what sits inside the case. Many of these watches actually hint at those mechanics with small windows or gaps where you can see the spinning cogs.

The turnaround of Apple seemed to start with the iMac, a beautiful machine that had a coloured but transparent case, exposing to the world the workings inside.

Transparent business

So it is with business where there are cheap ways of achieving many goals. New products and services can be inserted into already cluttered offerings and it can all be papered over by a thin veneer of customer service and digital interfaces that try to hide the complexity. These are the equivalent of the ten dollar watch.

I had a recent experience of a business that was not transparent. After six months, I noticed a strange charge had been appearing on my telephone bill. The company listing the charges claimed that somewhere I had agreed to their “special offer”. They could not tell me how I had done so but were happy to refund the charges. The real question, of course, is how many thousands of people never notice and never claim the charges back?

Whether it is government, utilities, banking or retail, our interactions with those that provide us products and services are getting more complex. We can either hide the complexity by putting artificial facades over the top (such as websites with many interfaces) or embrace the complexity through better design. I have previously argued that cognitive analytics, in the form of artificial intelligence, would reduce the workforce employed to manage complexity (see Your insight might protect your job), but this will do nothing to improve the customer experience.

Far from making people feel that business is simpler, the use of data through analytics in this way can actually make them feel that they have lost even more control. Increasingly they choose the simpler option such as being a guest on a single purpose website rather than embracing a full service provider that they do not understand.

Good governance

Target in the US had this experience when their data analytics went beyond the expectations of what was acceptable to their customers (see The Incredible Story Of How Target Exposed A Teen Girl’s Pregnancy).

In this age of Big Data, good data governance is an integral part of the customer experience. We are surrounded by more and more things happening that go beyond our expectations. These things can seem to happen as if by magic and lead us to a feeling of losing control in our interactions with businesses.

Just as there is a trend to open factories to the public to see how things are made, we should do the same in our intellectual pursuits. As experts in our respective fields, we need to be able to not only achieve an outcome but also demonstrate how we got there.

I explained last month how frustrating it is when customer data isn’t used (see Don’t seek to know everything about your customer). Good governance should seek to simplify and explain how both business processes and the associated data work and are applied.

The pressure for “forget me” legislation and better handling of data breaches will be alleviated by transparency. Even better, customers will enjoy using services that they understand.

Category: Enterprise Data Management, Enterprise2.0, Information Governance

by: RickDelgado
30 Jul 2015

Enterprise Storage: Keeping Up with the Data Deluge

The increasing demands found in data centers can be difficult for most people to keep up with. We now live in a world where data is being generated at an astounding pace, which has led to experts coining the phrase “big data.” All that generated data is also being collected, which creates even bigger demands for enterprise data storage. Consider all the different trends currently in play, from video and music streaming to the rise of business applications to detailed financial information and even visual medical records. It’s no wonder that storage demands have risen around 50 percent annually in the past few years, and there appears to be nothing on the horizon that will slow that growth. Companies have reason for concern as current data demands threaten to stretch their enterprise storage to its breaking point, but IT departments aren’t helpless in this struggle. This data deluge can be managed; all that’s needed are the right strategies and technologies to handle it.

It isn’t just the fact that so much new data needs to be stored; it’s that all of it must be stored securely while still allowing authorized personnel to access it efficiently. Combine that with a rapidly changing business environment where needs can evolve almost daily, and the demands for an agile and secure enterprise storage system can overwhelm organizations. The trick is to construct infrastructure that can manage these demands. A well-designed storage network can relieve many of the headaches that come with dealing with large amounts of data, but such a network requires added infrastructure support.

Luckily, IT departments have many options to choose from that can meet the demands of the data deluge. One of the most popular at the moment is storage virtualization. This technology works by combining multiple network storage devices so that they appear to be a single unit. Choosing the components for a virtualized storage system, however, can be a tough decision. Network-attached storage (NAS), for instance, helps people within an organization access the same data at the same time. Storage area networks (SAN) make planning and implementing storage solutions much easier. Both carry certain advantages over the more traditional direct-attached storage (DAS) deployments seen in many businesses. DAS simply comes with too many risks and downsides, making it a poor choice when confronting the data challenges many companies currently face. Whether choosing NAS or SAN, both can simplify storage administration, an absolute must when storage management has become so complex. They can also reduce the amount of hardware needed thanks to converged infrastructure technology.
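The core idea of storage virtualization, several physical devices presented as one logical unit, can be sketched in a few lines. The class and device names below are a toy illustration under that assumption, not any vendor’s API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BackingDevice:
    name: str
    capacity_gb: int
    used_gb: int = 0

@dataclass
class VirtualPool:
    """Toy model of storage virtualization: many devices presented as one unit."""
    devices: List[BackingDevice] = field(default_factory=list)

    @property
    def capacity_gb(self) -> int:
        return sum(d.capacity_gb for d in self.devices)

    def allocate(self, size_gb: int) -> None:
        # Spread the allocation across whichever devices have free space;
        # the caller only ever sees a single pool.
        remaining = size_gb
        for d in self.devices:
            take = min(d.capacity_gb - d.used_gb, remaining)
            d.used_gb += take
            remaining -= take
            if remaining == 0:
                return
        raise RuntimeError("pool exhausted")

pool = VirtualPool([BackingDevice("nas-01", 500), BackingDevice("san-01", 2000)])
pool.allocate(750)
print(pool.capacity_gb, [(d.name, d.used_gb) for d in pool.devices])
```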

But these strategies aren’t the only ones companies can use to keep up with enterprise storage demands. Certain administrative tactics can be deployed to handle the growing volume and complexity of the current storage scene. Part of that strategy is avoiding certain mistakes, such as storing non-critical data on costly storage devices. There’s also the problem of storing too much. In some cases, business leaders ask IT workers to store multiple copies of information even when those copies aren’t needed. IT departments need to work closely with the business side of the company to devise the right strategy and avoid these unnecessary complications. Streamlining the process makes storage easier to manage.
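One simple way to surface the “multiple copies” problem is to group files by a content hash. The sketch below is a generic illustration (the directory path is a placeholder), not a substitute for proper storage-management tooling.

```python
import hashlib
from collections import defaultdict
from pathlib import Path
from typing import Dict, List

def find_duplicates(root: str) -> Dict[str, List[Path]]:
    """Group files under `root` by content hash so redundant copies can be reviewed."""
    by_hash: Dict[str, List[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    # "/data/archive" is a placeholder path for illustration only.
    for digest, copies in find_duplicates("/data/archive").items():
        print(f"{len(copies)} copies of {digest[:12]}: {[str(p) for p in copies]}")
```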

Other options are also readily available to meet enterprise storage demands. Cloud storage, for example, has quickly become mainstream and comes with attractive advantages, such as easy scalability when businesses need it and the ability to access data from almost anywhere. Concerns over data security have made some businesses reluctant to adopt the cloud, but many cloud storage vendors are trying to address those worries with greater emphasis on security features. Hybrid storage solutions are also taking off in popularity in part because they mix many of the advantages found in other storage options.

With the demands large amounts of data are placing on enterprise storage, IT departments are searching for answers that can help them keep up with these challenges. The options to meet these demands exist, but it’s up to companies to deploy them fully. Data continues to be generated at a breakneck pace, and that trend won’t be slowing down anytime soon. It’s up to organizations to have the right strategies and technology in place to take full advantage of this ongoing data deluge.

 

Category: Enterprise Data Management

by: Robert.hillard
25 Jul 2015

Don’t seek to know everything about your customer

I hate customer service surveys. Hotels and retailers spend millions trying to speed our checkout or purchase by helping us avoid having to wait around. Then they undo all of that good work by pestering us with customer service surveys which take longer than any queue that they’ve worked so hard to remove!

Perhaps I’d be less grumpy if all of the data that organisations spend so much time (much of it ours) collecting was actually applied in a way that provided tangible value. The reality is that most customer data simply goes to waste (I argue this in terms of “decision entropy” in chapter 6 of my book, Information-Driven Business).

Customer data is expensive

Many years ago, I interviewed a large bank about their data warehouse. It was the 1990s and the era of large databases was just starting to arrive. The bank had achieved an impressive feat of engineering by building a huge repository of customer data, although they admitted it had cost a phenomenal sum of money to build.

The project was a huge technical success, overcoming so many of the performance hurdles that plagued large databases of the time. It was only in the last few minutes of the interview that the real issue started to emerge. The data warehouse investment was in vain: the products that they were passionate about taking to their customers were deliberately generic and there was little room for customisation. Intimate customer data was of little use in such an environment.

Customer data can be really useful but it comes at a cost. There is the huge expense of maintaining the data and there is the goodwill that you draw upon in order to collect it. Perhaps most importantly, processes to identify a customer and manage the relationship add friction to almost every transaction.

Imagine that you own a clothing or electrical goods store. From your vantage point behind the counter you see a customer run up to you with cash in one hand and a product in the other. They look like they’re in a hurry and thrust the cash at you. Do you a) take the cash and thank them; or b) ask them to stop before they pay and register for your loyalty programme, often including a username and password? It’s obvious you should go with option a, yet so many retailers go with option b. At least online businesses have the excuse that they can’t see the look of urgency and frustration in their customers’ eyes; it is impossible to fathom why so many bricks-and-mortar stores make the same mistake!

Commoditised relationships aren’t bad

Many people argue that Apple stores are close to best practice when it comes to retail, yet for most of the customer interaction the store staff member doesn’t know anything about the individual’s identity. It is not until the point of purchase that they actually access any purchase history. The lesson is that if the service is commoditised it is better to avoid cluttering the process with extraneous information.

Arguably the success of discount air travel has been the standardisation of the experience. Those who spend much of their lives emulating the movie Up in the Air want to be recognised. For the rest of the population, who just want to get to their destination at the lowest price possible while keeping a small amount of comfort and staying safe, a commoditised service is ideal. Given the product is not customised there is little need to know much about the individual customers. Aggregate data for demand forecasting can often be gained in more efficient ways including third party sources.

Do more with less

Online and in person, organisations are collecting more data than ever about their customers. Many of these organisations would do better to focus on a few items of data and build true relationships by understanding everything they can from this small number of key data elements. I’ve previously argued for the use of a magic 150 or “Dunbar’s number” (see The rule of 150 applied to data). If they did this, not only would they be more effective in their use of their data, they could also be more transparent about what data they collect and the purposes to which they put it.

People increasingly have a view of the value of their information and they often end up resenting its misuse. Perhaps the only thing worse than misusing it is not using it at all. So much information is collected that resentment follows when the customer doesn’t get the obvious benefit that should have been derived from it. Nothing frustrates people more than having to tell their providers things that are obvious from the information they have already been asked for, such as their interests, family relationships or location.

Organisations that don’t heed this will face a backlash as people seek to regain control of their own information (see You should own your own data).

Customers value simplicity

In this age of complexity, customers are often willing to trade convenience for simplicity. Many people are perfectly happy to be a guest at the sites they use infrequently, even though they have to re-enter their details each time, rather than having to remember yet another login. They like these relationships to be cheerfully transactional and want their service providers to respect them regardless.

The future is not just more data, it is more tailored data with less creepy insight and a greater focus on a few meaningful relationships.

Category: Enterprise Data Management, Enterprise2.0, Information Development, Master Data Management

by: RickDelgado
21 Jul 2015

Anxious About BYOD? Here are Some Tips for Success

Has your organization caved to the pressure of establishing a Bring Your Own Device (BYOD) policy and is now having second thoughts? Making company-wide policy changes and satisfying tech-savvy employees’ desires is just the beginning. Once BYOD is up and running, there are many challenges. The difference between success and failure lies in addressing key concerns and finding ways to overcome these issues.

Mobile Device Management

Security is undoubtedly the most pressing concern with BYOD. Even with a sound policy, the rapidly shifting security landscape is a challenge, as is the constant updating of devices. You must constantly adapt your threat defenses and corporate policies. Mobile Device Management (MDM) provides many benefits, including a centralized view of data stored on devices. There are many cases of unhappy employees misusing sensitive information or hackers accessing vulnerable mobile networks. The safest approach is one where administrators can see the first signs of a breach and take action.

An MDM system provides access control and monitoring of corporate data. Information on a stolen or lost device can be immediately erased. Mobile apps have created challenges of their own: many of them collect personal data and store it in the cloud. An important feature to look for is Mobile Application Management, which keeps track of all the apps on your mobile network and can even block ones known to be particularly risky.

Vendor Managed Services

Not every company employs all the talent it needs. A cost-effective way to offset this imbalance is to pursue vendor managed services. Consulting organizations have emerged in the mobile era that employ the technology, tools, and methods to efficiently manage data. DataXoom, a mobile virtual network operator, provides MDM, asset management, and even assistance with procuring the best hardware and software. The ultimate goal is to manage the financial cost of BYOD and the data stored on, and accessed by, mobile devices.

Stay Compliant

Compliance with the latest standards is essential for keeping BYOD in your company. The Payment Card Industry Data Security Standard 3.0 is one you should be following. It provides guidelines and testing procedures related to building a secure network, protecting cardholder data, and implementing effective access control. Also covered are monitoring and testing and maintaining an information security policy that includes all devices, systems, and personnel. The PCI DSS 3.0 standard is also a guideline for internal and external auditors.

Fine Tune Your Policy

A BYOD policy isn’t static. It needs to adapt to changing security risks and company requirements. For the policy to work, you need to identify what devices are permitted on the network, and control information access down to the individual device. Administrators also need to think about password complexity, screen locking, and other security measures.
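In practice a policy like this ends up encoded as rules that an MDM system evaluates against each device. The sketch below is a hypothetical, simplified version of such a check; the permitted models and passcode length are illustrative values, not recommendations.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Illustrative policy values only; a real policy would come from the MDM console.
PERMITTED_MODELS = {"iPhone", "Pixel", "Galaxy"}
MIN_PASSCODE_LENGTH = 6

@dataclass
class Device:
    model: str
    os_version: str
    passcode_length: int
    screen_lock_enabled: bool

def check_compliance(device: Device) -> Tuple[bool, List[str]]:
    """Evaluate a device against the BYOD policy and report any failures."""
    failures = []
    if device.model not in PERMITTED_MODELS:
        failures.append(f"model {device.model} is not on the permitted list")
    if device.passcode_length < MIN_PASSCODE_LENGTH:
        failures.append("passcode does not meet the complexity requirement")
    if not device.screen_lock_enabled:
        failures.append("screen lock is disabled")
    return (not failures, failures)

ok, problems = check_compliance(Device("Pixel", "14", passcode_length=4, screen_lock_enabled=True))
print(ok, problems)  # -> False ['passcode does not meet the complexity requirement']
```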

Other elements of your policy should outline how technical support operates. Also include permitted apps and rules for acceptable websites, materials, and how all of these are monitored. In addition to governing usage, your leaders should also have a plan for what happens when an employee leaves the company. Do they return the phone or do you just remove access to email, company apps, and data?

Some organizations have resorted to a Choose Your Own Device (CYOD) policy. Users are issued corporate-owned devices, and they may or may not have a choice from approved products. This gives the company more control over compliance and security, while it pays all costs related to the device.

What about Privacy?

Today’s employees have been outspoken about their right to keep personal data on the same device as their work. The challenge is that businesses must protect their mobile networks against unauthorized use. Employer access rules have drawn controversy amongst IT policy drafters. While work-related data could be subject to legal investigations down the road, personal information would be exposed as well. The level of control over personal data has been less than ideal for many workers. Yet privacy matters still need to be addressed.

Conclusion

These are just a few of the main issues regarding corporate BYOD. Implementing the policy takes work, and continual monitoring and adjustment are required to keep a mobile device policy successful. That means your company and stakeholders must adjust to change. Security challenges, compliance requirements, employee sentiment, and the devices themselves will certainly be in flux in the years to come.

Category: Enterprise Data Management

by: RickDelgado
18 Jun 2015

Flash Quietly Taking Over Disk in a Big Data World

Right now, we live in the big data era. What was once looked at as a future trend is now very much our present reality. Businesses and organizations of all shapes and sizes have embraced big data as a way to improve their operations and find solutions to longstanding problems. It’s almost impossible to overstate just how much big data has impacted the world in such a short amount of time, affecting everyone’s life whether or not we truly comprehend how. That means we live in a world awash in data, and as companies pursue their own big data strategies, they’ve had to rethink how to store all that information. Traditional techniques have proven unable to handle the huge amount of data being generated and collected on a daily basis. What was once dominated by hard disk drives (HDD) is now rapidly changing into a world driven by solid-state drives (SSD), otherwise known as flash storage.

For years, when talking of big data analytics, the assumption was that a business was using disk. There were several reasons for this, the main one being cost. Hard disk drives were simply cheaper, and for the most part they could deal with the increasing workloads placed upon them. The more data measured and generated, however, the more the limitations of HDD were unmasked. This new big data world needed a storage system capable of handling the workload, and thus the migration to flash storage began.

Many, including Gartner, peg 2013 as the year the switch really gained steam. Solid-state arrays had already been a storage strategy up until then, but in 2013 flash storage manufacturers began constructing arrays with new features like thin provisioning, deduplication, and compression. Suddenly, the benefits gained from using flash storage outweighed some of the drawbacks, most notably the higher cost. In a single year, solid-state arrays saw a surge in sales, increasing by more than 180 percent from 2012. With the arrival of flash storage to the mainstream, organizations could begin to replace their hard disk drives with a system more capable of processing big data.

And that’s really the main reason why flash storage has caught on so quickly: SSDs provide much higher performance than traditional storage options. Of particular note is the reduction in the time it takes to process data. Just one example is the experience of the Coca-Cola Bottling Co., which began collecting big data but was soon met by long delays in production due to having to sort through loads of new information. When the company adopted flash storage solutions, the amount of time needed to process data was cut dramatically: processing jobs that had taken 45 minutes took only six. These kinds of results aren’t unique, which is why so many other businesses are turning to flash storage as their primary means of storing big data.

Many tech companies are responding to this increased demand by offering up more options in flash storage. SanDisk has recently unveiled new flash systems specifically intended to help organizations with their efforts in big data analytics. The new offerings are meant to be an alternative to the tiered storage often seen in data centers. Other major tech companies, such as Dell, Intel, and IBM, have shown similar support for flash storage, indicating the lucrative nature of offering flash solutions. The growth isn’t just being driven by private companies either; educational institutions have found a need for flash storage as well. MIT researchers announced last year that they would be switching from hard disks to flash storage in order to handle the demands of big data more effectively. The researchers determined that hard disk drives were too slow, so a better performing storage solution was needed.

As can be seen, flash storage has been quietly but surely taking over hard disk’s turf. That doesn’t mean hard disk drives will soon be gone for good. HDD will likely still be used for offline storage — mainly archiving purposes for data that doesn’t need to be accessed regularly. But it’s clear we’re moving into a world where solid-state drives are the most prevalent form of storage. The need to collect and process big data is making that happen, providing new, unique opportunities for all kinds of organizations out there.

Category: Enterprise Data Management

by: RickDelgado
15 Apr 2015

Cloud Computing and the Industries that Love It

Cloud computing provides greater security, virtually unlimited computing resources for research and development, cost savings, and advanced threat detection methods. With so many reasons to use cloud computing, it’s no wonder many industries have flocked to the technology. Cloud technology serves as a form of outsourcing for companies, where some data is kept in house for better control, and other data is entrusted to a third-party provider. Each industry that benefits from cloud computing has its own specific reasons for adopting the technology, but cloud computing is most profitable for companies that work with emerging markets and need quick and cost-effective scalability.

The Chemicals Industry

Chemical companies are being driven to improve their flexibility, reduce costs, improve speed and become more responsive. Cloud computing supports this by transforming chemical companies into thriving, cloud-based digital businesses. Chemical companies must be prepared to penetrate new markets quickly. Higher speed and greater visibility are also continually evolving needs in this industry as collaboration becomes more important than ever.

As governments continue to push for green legislation, chemical companies often find themselves in the crosshairs of local and federal regulators. Switching to cloud-based providers offers one way to increase accountability and reduce resource consumption. Additionally, as cost efficiency becomes increasingly important, chemical companies love the fact that cloud computing provides greater operational agility and cost savings across the entire industry.

Chemical companies use IaaS and SaaS in an effort to control costs and use virtualization to create private cloud architectures for their businesses. For example, Dow Chemical met the requirements for 17 European countries by moving its operations to the cloud.

Law Firms

Law firms deal with large amounts of data on a regular basis, and they need to ensure that the data stays safe. In the past, law firms have mainly relied upon in-house servers to manage their operations. As the expense of maintaining computers, servers and software, and of hiring IT administrators, has grown, cloud computing has become an attractive alternative.

Even the simplest building closure can put a law firm out of reach of its data. This can seriously hamper an attorney’s ability to effectively manage clients and maintain a high level of service. By moving data to the cloud, law firms become better prepared to deal with disasters and can reduce the possibility of being without crucial data. Lawyers must also increasingly work outside the office to meet with clients and maintain a high level of effectiveness. Cloud computing makes accessing content securely while away from the office a much simpler and safer endeavor.

There are unique ethical considerations that any law firm must weigh when entrusting its data to a third party. Law firms can maintain control of their data while still utilizing cloud servers for advanced threat defense, security and non-sensitive applications. While law firms must exercise due diligence and talk specifically with their cloud provider about data center locations, how data is treated, encryption levels, and their duties in the event of a subpoena, the move to the cloud offers a chance for greater efficiency, reliability and cost savings.

Startup Communication Companies

The trend with new startups is to get going quickly and define the work at a later date. Many startups don’t have a clear mission plan and rely upon data received from initial product launches to determine the direction the company will take. Startups are in a unique position to be extremely flexible and adaptable. Established companies generally have preferred software applications, complex networking arrangements and a system that requires careful planning before any changes are made to the infrastructure. With a startup, the entire structure of the company can be taken down and rebuilt in a single business day, which makes working in the cloud a dream for new companies.

No industry knows the importance of flexibility better than the communications industry. With new technologies being developed and emerging daily, it has become increasingly important to have a dynamic and scalable workspace for research and development. The cloud provides an ideal environment for companies that may need terabytes of data one day and a few gigabytes the next: resources can be effectively managed without having to upgrade hardware, invest in costly data centers or hire several IT administrators to keep things running smoothly.

Government Agencies and Law Enforcement

Government agencies, including the CIA, FBI and local law enforcement, are continually evaluating cloud architecture to determine ways it can be utilized to increase efficiency, manage multiple departments and improve mobility. Governments largely deem cloud computing a safe alternative to traditional in-house servers as it provides advanced threat detection and a high degree of security.

Cross-agency cooperation is essential for governments that need to share information on a state and federal level. By keeping information available in the cloud, state agencies can work more effectively with federal authorities. This makes it possible to share information quickly, and improve the ability to stop an advanced threat before it causes harm. Governments can use public cloud services for less critical information, and a private cloud service for the most sensitive data. This provides the best of both worlds.

The Future of Cloud Computing

Any industry that needs a highly secure, adaptable and scalable computing environment can benefit from cloud computing. From the music industry to the local startup that is still defining its purpose, cloud computing can reduce costs, improve efficiency and increase security for any company. As governments continue to impose strict fines and penalties for failing to maintain good security practices, it has become more important than ever to safeguard company and customer information. The cloud does this at a low cost and with great flexibility.

 

Category: Enterprise Data Management

by: RickDelgado
24 Mar 2015

The Debate Continues: The Future Impact of Net Neutrality on the Cloud

The debate over Net Neutrality is far from over. While the recent ruling by the FCC to classify broadband internet as a public utility may have changed the argument, debates will undoubtedly continue. The effects the decision has on the web will likely not be felt, let alone understood, for many years to come, but that hasn’t stopped speculation over what a neutral internet will actually look like and how companies and internet service providers (ISPs) will be impacted. At the same time, the future of cloud computing has become a hot topic as experts debate whether Net Neutrality will be a boost to cloud providers or whether the overall effect will be negative. Looking at the current evidence and what many providers, companies, and experts are saying, the only thing that’s clear is that few people can agree on what Net Neutrality will mean for the cloud and the advantages of cloud computing.

The basic idea of Net Neutrality is, in the simplest of terms, to treat all internet traffic the same. Whether from a small niche social site or a major online retail hub, content would be delivered equally. This sounds perfectly reasonable on the surface, but critics of the Net Neutrality concept say all websites simply aren’t equal. Sites like Netflix and YouTube (mainly video streaming sites) eat up large amounts of bandwidth when compared to the rest of the internet, and as streaming sites grow in popularity, they keep eating up more and more web resources. The theory goes that ISPs would provide internet “fast lanes” to those sites willing to pay the fee, giving them more bandwidth in comparison to other sites, which would be stuck in “slow lanes.” It’s this idea that proponents of Net Neutrality want to guard against, and it’s one of the biggest points of contention in the debate.

Obviously, this is a simplified view of Net Neutrality, but it’s useful background when looking at the effect the new ruling could have on cloud computing. First, let’s take a look at how cloud providers might be affected without a neutral internet. Supporters of Net Neutrality say a “fast lane” solution would represent an artificial competitive advantage for those sites with the resources to pay for it. That could mean a lack of innovation on the part of cloud vendors as they spend added funds to get their data moved more quickly while getting a leg up on their competition. A non-neutral internet may also slow cloud adoption among smaller businesses. If a cloud software provider has to pay more for fast lanes, those costs can easily be passed on to the consumer, which would raise the barrier to cloud use. The result may be declining cloud adoption rates, or at the least, the performance of cloud-based software may degrade.

On the other side of the coin, critics of Net Neutrality say the effect of the policy will end up damaging cloud computing providers. They’re quick to point out that innovation on the internet has been rampant without new government regulations, and that ISPs could easily develop other innovative solutions besides the “fast lane” approach Net Neutrality supporters are so afraid of. Government rules can also be complicated and, in the case of highly technical fields, would need to be constantly updated as new technology is developed. This may give larger companies and cloud providers an advantage over their competition since they would have the resources to devote to lobbyists and bigger legal budgets to dedicate to understanding new rules. There’s also the concern over getting the government involved in the control of pricing and profits in the first place. Needless to say, many aren’t comfortable with giving that level of control to a large bureaucracy and would rather let market freedom take hold.

Some may say that with the new FCC ruling, these arguments don’t apply anymore, but changes and legal challenges will likely keep this debate lively for the foreseeable future. Will Net Neutrality lead to government meddling in cloud provider pricing and contracts? Will a lack of Net Neutrality slow down cloud adoption and give too much power to ISPs? Unfortunately, there’s no way of knowing the far-reaching consequences of the decision on the cloud computing landscape. It could end up having very little impact in the long run, but for now, it appears Net Neutrality will become a reality. Whether that’s a good or bad thing for the cloud remains to be seen.

 

Category: Enterprise Data Management

by: Gil Allouche
15 Jan 2015

Keeping Big Data Secure: Should You Consider Data Masking?

Big data is a boon to every industry. And as data volumes continue their exponential rise, the need to protect sensitive information from being compromised is greater than ever before. The recent data breach at Sony Pictures and new threats from foreign actors serve as a cautionary tale for government and private enterprise to be constantly on guard and on the lookout for new and better solutions to keep sensitive information secure.

One security solution, “data masking”, is the subject of a November 2014 article on Nextgov.com.

In the article, Ted Girard, a vice president at Delphix Federal, defines what data masking is, along with its applications in the government sector. Since data masking also has non-government applications, organizations wondering whether this solution is something they should consider for protecting original production data should find the following takeaways from the Nextgov article helpful.

The information explosion
Girard begins by stating the plain and simple truth that in this day and age of exploding volumes of information, “data is central to everything we do.” That being said, he warns that, “While the big data revolution presents immense opportunities, there are also profound implications and new challenges associated with it.” Among these challenges, according to Girard, are protecting privacy, enhancing security and improving data quality. “For many agencies just getting started with their big data efforts”, he adds, “these challenges can prove overwhelming.”

The role of data masking
Speaking specifically of governmental needs to protect sensitive health, education, and financial information, Girard explains that data masking is, “a technique used to ensure sensitive data does not enter nonproduction systems.” Furthermore, he explains that data masking is, “designed to protect the original production data from individuals or project teams that do not need real data to perform their tasks.” With data masking, so-called “dummy data”— a similar but obscured version of the real data—is substituted for tasks that do not depend on real data being present.
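The dummy-data idea can be illustrated with a short sketch. This is a generic illustration of masking, not Delphix’s product or the approach described in the article: sensitive values are replaced with fabricated ones that keep the original shape, and a deterministic seed keeps the masked values consistent across data sets.

```python
import hashlib
import random
from typing import Dict, Set

def mask_record(record: Dict[str, object], sensitive_fields: Set[str]) -> Dict[str, object]:
    """Return a copy of `record` with sensitive values replaced by dummy data
    that preserves the shape (length, digits vs letters) of the original."""
    masked = dict(record)
    for field in sensitive_fields:
        value = str(record[field])
        # Seed from a hash of the real value so the same input always masks to
        # the same output, keeping joins across masked data sets consistent.
        rng = random.Random(hashlib.sha256(value.encode()).hexdigest())
        masked[field] = "".join(
            rng.choice("0123456789") if ch.isdigit()
            else rng.choice("abcdefghijklmnopqrstuvwxyz") if ch.isalpha()
            else ch
            for ch in value
        )
    return masked

# Field names and values are made up for illustration.
patient = {"id": 1042, "name": "Jane Citizen", "ssn": "123-45-6789", "code": "E11"}
print(mask_record(patient, {"name", "ssn"}))
```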

The need for “agile” data masking solutions
As Girard points out, one of the problems associated with traditional data masking is that, “every request by users for new or refreshed data sets must go through the manual masking process each time.” This, he explains, “is a cumbersome and time-consuming process that promotes ‘cutting corners’– skipping the process altogether and using old, previously masked data sets or delivering teams unmasked versions.” As a result, new agile data masking solutions have been developed to meet the new demands associated with protecting larger volumes of information.

According to Girard, the advantage of agile data masking is that it, “combines the processes of masking and provisioning, allowing organizations to quickly and securely deliver protected data sets in minutes.”

The need for security and privacy
As a result of collecting, storing and processing sensitive information of all kinds, government agencies need to keep that information protected. Still, as Girard points out, “Information security and privacy considerations are daunting challenges for federal agencies and may be hindering their efforts to pursue big data programs.” The good news with “advance agile masking technology”, according to Girard, is that it helps agencies, “raise the level of security and privacy assurance and meet regulatory compliance requirements.” Thanks to this solution, Girard says that, “sensitive data is protected at each step in the life cycle automatically.”

Preserving data quality
Big data does not necessarily mean better data. According to Girard, a major cause of many big data project failures is poor data. In dealing with big data, Girard says that IT is faced with two major challenges:

1. “Creating better, faster and more robust means of accessing and analyzing large data sets…to keep pace.”
2. “Preserving value and maintaining integrity while protecting data privacy….”

Both of these challenges are formidable, especially with large volumes of data migrating across systems. As Girard explains, “…controls need to be in place to ensure no data is lost, corrupted or duplicated in the process.” He goes on to say that, “The key to effective data masking is making the process seamless to the user so that new data sets are complete and protected while remaining in sync across systems.”

The future of agile data masking
Like many experts, Girard predicts that big data projects will become a greater priority for government agencies over time. Although not mentioned in the article, the NSA’s recent installation of a massive $1.5 billion data center in Utah serves as a clear example of the government’s growing commitment to big data initiatives. In order to successfully analyze vast amounts of data securely and in real time going forward, Girard says that agencies will need to “create an agile data management environment to process, analyze and manage data and information.”

In light of growing security threats, organizations looking to protect sensitive production data from being compromised in less-secure environments should consider data masking as an effective security tool for both on-premise and cloud-based big data platforms.

Category: Enterprise Data Management
