February 24th, 2015 – A Case Study for Analytics
Traditional direct marketing suffers from low data quality in its databases, which directly impacts analytics and the marketing rate of return. A classic case study in analytics is the creation of unique customer identity keys, which makes the analytics of the business customer-oriented rather than account-oriented. The tradeoff between an account orientation and a single customer view arises from the varying complexity of data populated across several accounts, sub-business domains and marketing relationships.
For example:
I may be A Ohri in a telecom company’s account as a broadband subscriber, but in another table my name may be written as Ajay O as a prepaid card holder. Different account numbers are allotted to the same customer (here, me) because of the different relationships I may have (mobile broadband and pre-paid mobile). My address may be R32b in one address field, and R Block 32 in another table.
Yet it would be bad marketing by the telecom company if it sent me advertising messages on my broadband Internet and short messages (SMS) on my pre-paid mobile for different marketing activities, relying only on an account-centric view rather than a customer-centric one. The problem of creating a unique ID from multiple accounts can be resolved by text mining and by building links across relationships. Using fuzzy logic for matching also yields better matches and analysis than simple exact joins.
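As a rough illustration of the fuzzy matching idea, here is a minimal sketch in Python using the standard library’s difflib; the records, fields and threshold are hypothetical, not the client’s actual data or rules.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0-to-1 similarity ratio between two normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

# Two account records that an exact join would treat as different customers.
record_1 = {"name": "A Ohri", "address": "R32b"}
record_2 = {"name": "Ajay O", "address": "R Block 32"}

name_score = similarity(record_1["name"], record_2["name"])
address_score = similarity(record_1["address"], record_2["address"])

# Treat the two accounts as one customer when the combined evidence is strong.
# In practice the threshold is tuned against validated matches.
THRESHOLD = 0.5
if (name_score + address_score) / 2 >= THRESHOLD:
    print("probable match: assign a single customer key")
else:
    print("keep as separate customers")
```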
For a certain client, we had 88 million accounts belonging to 55 million customers. It took us 3 hours to write a single SQL query to create a modified key for this dataset. In the end we managed to create a rule which generated an alphanumeric primary key for the database – the first initial of the first name, concatenated with the surname, concatenated with the PIN code of the address. Of course, many permutations and combinations were tried out, and the uniqueness of each one was validated and the resulting curves compared. Each permutation and combination of digits of names and addresses resulted in a different set of total unique customers. If the key uses too little information (say, just the first initial of the first name concatenated with the last initial of the surname), the number of unique values shrinks, because many false duplicates are created. Thus the tradeoff was between duplication and uniqueness.
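A minimal sketch of such a key-construction rule, in Python; the field names and values are illustrative, not the client’s actual data.

```python
def customer_key(first_name: str, surname: str, pin_code: str) -> str:
    """Alphanumeric key: first initial + surname + postal PIN code."""
    first = first_name.strip()
    initial = first[0].upper() if first else ""
    return initial + surname.strip().upper().replace(" ", "") + pin_code.strip()

# Two differently recorded accounts for the same person collapse to one key.
print(customer_key("Ajay", "Ohri", "110001"))  # AOHRI110001
print(customer_key("A", "Ohri", "110001"))     # AOHRI110001
```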
From this alphanumeric key, we could ensure uniqueness in the database marketing efforts.
What were the results of this exercise? Well, it was found that some customers had been mailed the same offer three times earlier, while other customers had been treated as dormant accounts, because the earlier unique account ID was insufficient for analytical purposes. Of course, one of the primary ways to validate the key was that the business knew approximately how many unique customers should exist in total, based on the billable relationships and other data.
The new unique customer ID eliminated the need for excessive marketing or spamming, and each customer could be treated based on total value across the entire relationship. Thus this very basic analytics exercise, providing a valid unique key, paved the way for advanced business analytics like Recency-Frequency-Monetary (RFM) analysis and regression modeling.
Take a look at the databases that you are working on. Are you sure you are looking at customers and not accounts? An account number is just a few bytes of information, but a customer is a human being. Treating people better through analytics, rather than treating them as just a few bytes of numbers, is the key to a successful analytics practice. Creating a better unique ID is the first analytical step to better knowledge discovery in databases.
SAS analytics requires a certain amount of domain knowledge and technical expertise. Respecting your own brand, your boundaries and your own biases, and being mindful of customer delight, can enable you to work, survive and thrive in a career as an independent analytics consultant.
February 24th, 2015 – Splunk Put to Innovative Use by NPR for Web Analytics
In a novel use of the software, National Public Radio is using the Splunk log search engine to analyze Web traffic for its audio streams and downloads. Splunk offers what it calls a search engine for machine data. It was originally built to parse log files, or the files programs and hardware generate to document their transactions, errors and other operational information. By coordinating the timestamps of messages from different applications and hardware, Splunk allows system administrators to pinpoint difficult-to-locate system problems.
In recent years, however, customers have been expanding their uses of Splunk to other duties, explained Splunk Chief Technology Officer Erik Swan. Web traffic analysis and business intelligence are two such ancillary uses.
For much of its Web traffic monitoring, NPR uses standard Web traffic analytic software, which can deliver reports on how many people visit each Web page. Such software usually generates these counts by using cookies or by embedding each page with a small script that alerts the software when the page is rendered in a browser.
The media organization, however, found it difficult to get reliable usage summaries for a number of aspects of its service. For instance, the organization needed an accurate count of how many listeners tuned into its streamed audio and video programs.
To get this data, NPR had prepared a PHP script that would parse the server log files and translate the results into a form that could be digested by Adobe’s Omniture, a Web analytic tool. Getting information back, however, could take up to 24 hours.
In the case of streaming usage, many users might start a stream, then pause it and restart it. Or perhaps a user would restart a stream after a failed Internet connection. In the server log files, all of these were logged as separate events, not as a linear sequence of actions by a single user. As a result, there was no way of determining how many connections were from different listeners and how many were multiple streams to a single user.
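The underlying task is sessionization: stitching a listener’s scattered connection events back into listening sessions. Here is a minimal sketch of the idea in Python; the log fields, timestamps and 30-minute gap rule are hypothetical, not NPR’s actual data or rules.

```python
from datetime import datetime, timedelta

# Simplified log events: (client, timestamp) for stream connections.
events = [
    ("10.0.0.1", datetime(2015, 2, 24, 9, 0)),   # start
    ("10.0.0.1", datetime(2015, 2, 24, 9, 5)),   # restart after a pause
    ("10.0.0.2", datetime(2015, 2, 24, 9, 1)),   # a different listener
    ("10.0.0.1", datetime(2015, 2, 24, 13, 0)),  # hours later: a new session
]

SESSION_GAP = timedelta(minutes=30)  # reconnects within this window count as one listen

sessions = {}  # client -> list of session end-times
for client, ts in sorted(events, key=lambda e: e[1]):
    ends = sessions.setdefault(client, [])
    if ends and ts - ends[-1] <= SESSION_GAP:
        ends[-1] = ts    # extend the current session
    else:
        ends.append(ts)  # open a new session

total = sum(len(v) for v in sessions.values())
print(f"{len(sessions)} unique listeners, {total} listening sessions")  # 2 and 3
```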
By working with Splunk, NPR could derive listener numbers and information directly from its servers’ log files. The software allows users to script search results and then graph the results, or show them on a dashboard.
Splunk helped identify users’ mobile platforms as well. An increasing amount of traffic to the NPR site comes from mobile clients, such as iPhones, iPads and Android smartphones. In one case, a manager wanted to know which version of the iPhone operating system was most often used, as the results would direct the company’s design work for its iPhone app.
Splunk also solved a seemingly unsolvable problem for the organization: determining how to pay royalties for streamed songs. For each song played on its streams, NPR must pay out royalties through SoundExchange, based on the number of listeners the stream had at that moment.
Using Splunk, it is possible to merge two files – a list of when each song was played, and the number of listeners that stream had when the song was played.
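A minimal sketch of that merge in Python, as an as-of join of play times against the most recent listener count; the file contents and timestamps here are hypothetical.

```python
import bisect

# Hypothetical extracts from the two log-derived files.
plays = [("09:02", "Song A"), ("09:07", "Song B")]                   # (time, song)
listener_counts = [("09:00", 180), ("09:05", 210), ("09:10", 195)]   # (time, listeners)

count_times = [t for t, _ in listener_counts]

def listeners_at(play_time: str) -> int:
    """Most recent listener count at or before the play time."""
    i = bisect.bisect_right(count_times, play_time) - 1
    return listener_counts[i][1]

for t, song in plays:
    print(f"{t} {song}: {listeners_at(t)} listeners")
# 09:02 Song A: 180 listeners
# 09:07 Song B: 210 listeners
```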
I’m Marry Willimas. I have been covering business intelligence tools for the last 20 years, and I present my views on web analytics technology.
February 24th, 2015 – Driving theory test: Complete information
Theory test introduction:
The theory test is made up of two parts: the multiple-choice section and the hazard perception section. If you pass one section and fail the other, you fail the entire test and will need to take both sections again. Once you have passed the theory test, you can apply to take your practical driving test.
The theory test consists of 50 questions in a multiple-choice format, and you need at least 43 correct answers to pass. All the questions appear on a touch-sensitive computer screen, one at a time. You can skip forwards and backwards through the questions, and you have up to 57 minutes to complete the theory test.
Before you can apply for your practical driving test, you must pass your theory test.
The questions may cover the following topics:
Alertness
Safety and Your Vehicle
Attitude
Vulnerable Road Users
Safety Margins
Hazard Awareness
Vehicle Loading
Vehicle Handling
Other Types of Vehicle (Motorcyclists, Lorries, Buses)
Rules of the Road
Motorway Rules
Road and Traffic Signs
Accidents
If you have special needs, it is important that you state them at the time of booking your theory test so that the necessary arrangements can be made by the DSA.
The theory test can be taken in a number of other languages through the use of a headset giving a voice-over. The available languages are Spanish, Albanian, Arabic, Farsi, Cantonese, Hindi, Bengali, Turkish, Gujarati, Dari, Kashmiri, Punjabi, Kurdish, Polish, Mirpuri, Portuguese, Pushto, Tamil and Urdu.
If your language is not offered, you may be able to bring a translator. The translator must be approved by the DSA and at present can only be accommodated at the following theory test centres: Aldershot, Birmingham, Derby, Birkenhead, Cardiff, Edinburgh, Ipswich, Preston, Leeds, Milton Keynes, Glasgow and Palmers Green.
If you fail your theory test, you may retake it as many times as you need. However, you must wait a minimum of three working days between attempts. Preparing well before each attempt will save you time and money.
In most test centres you will get your result within about half an hour of completing the theory test and, if you pass, your theory test pass certificate as well.
If you hold a foreign licence, you may be able to simply exchange it for a UK licence; you would need to ask the Driver and Vehicle Licensing Agency. You do not have to take a theory test if you are updating your current full UK licence.
John Graham has been involved with www.theory-test.co.uk for years, writing useful driving test tips, theory test books, driving theory test questions and more. You can get driving test preparation material online there.
February 24th, 2015 – Information Systems Theory 101
“The first on-line, real-time, interactive, data base system was double-entry bookkeeping which was developed by the merchants of Venice in 1200 A.D.” – Bryce’s Law
Systems work is not as hard as you might think. However, we have a tendency in this business to complicate things by changing the vocabulary of systems work and introducing convoluted concepts and techniques, all of which makes it difficult to produce systems in a consistent manner. Consequently, there is a tendency to reinvent the wheel with each systems development project. I believe I owe it to my predecessors and the industry overall to describe basic systems theory, so that people can find the common ground needed to communicate and work. Fortunately, there are only four easy, yet important, concepts to grasp which I will try to define as succinctly as possible.
1. THERE ARE THREE INHERENT PROPERTIES TO ANY SYSTEM
Regardless of the type of system, be it an irrigation system, a communications relay system, an information system, or whatever, all systems have three basic properties:
A. A system has a purpose – such as distributing water to plant life, bouncing a communications signal around the country to consumers, or producing information for people to use in conducting business.
B. A system is a grouping of two or more components which are held together through some common and cohesive bond. The bond may be water as in the irrigation system, a microwave signal as used in communications, or, as we will see, data in an information system.
C. A system operates routinely and, as such, it is predictable in terms of how it works and what it will produce.
All systems embrace these simple properties. Without any one of them, it is, by definition, not a system.
For our purposes, the remainder of this paper will focus on “information systems” as this is what we are normally trying to produce for business. In other words, the development of an orderly arrangement or grouping of components dedicated to producing information to support the actions and decisions of a particular business. Information Systems are used to pay employees, manage finances, manufacture products, monitor and control production, forecast trends, process customer orders, etc.
If the intent of the system is to produce information, we should have a good understanding of what it is…
2. INFORMATION = DATA + PROCESSING
Information is not synonymous with data. Data is the raw material needed to produce information. Data by itself is meaningless. It is simply a single element used to identify, describe or quantify an object used in a business, such as a product, an order, an employee, a purchase, a shipment, etc. A data element can also be generated based on a formula as used in a calculation; for example:
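Gross Pay = Hours Worked × Hourly Rate

is one illustrative possibility: “Gross Pay” is never captured directly, but is derived by processing two stored data elements.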
Only when data is presented in a specific arrangement for use by the human being does it become information. If the human being cannot act on it or base a decision on it, it is nothing more than raw data. This implies data is stored, and information is produced. It is also dependent on the wants and needs of the human being (the consumer of information). Information, therefore, can be defined as “the intelligence or insight gained from the processing and/or analysis of data.”
The other variable in our formula is “processing” which specifies how data is to be collected, as well as its retrieval in order to produce information. This is ultimately driven by when the human being needs to make certain actions and decisions. Information is not always needed “upon request” (aka “on demand”); sometimes it is needed once daily, weekly, monthly, quarterly, annually, etc. These timing nuances will ultimately dictate how data is collected, stored, and retrieved. To illustrate, assume we collect data once a week. No matter how many times during the week we make a query of the data base, the data will only be valid as of the last weekly update. In other words, we will see the same results every day for one week. However, if we were to collect the data more frequently, such as periodically throughout the day, our query will produce different results throughout the week.
Our formula of “I = D + P” makes an important point: if the data is changed, yet the processing remains the same, the information will change. Conversely, if the data remains the same, yet the processing changes, the information will also change. This leads to a compelling argument to manage data and processing as separate but equal resources which can be manipulated and reused to produce information as needed.
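A trivial sketch of the point in Python, with made-up data values:

```python
# I = D + P: the same data under different processing yields different information.
data = [120, 80, 200, 150]            # D: weekly order counts (illustrative)

info_total = sum(data)                # P1: total demand   -> 550
info_average = sum(data) / len(data)  # P2: average demand -> 137.5

print(info_total, info_average)       # one data set, two pieces of information
```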
3. SYSTEMS ARE LOGICAL IN NATURE AND CAN BE PHYSICALLY IMPLEMENTED MANY DIFFERENT WAYS
An information system is a collection of processes (aka, “sub-systems”) to either collect and store data, to retrieve data and produce information, or a combination of both. The cohesive bond between these components is the data which should be shared and reused throughout the system (as well as other systems). You will observe we have not yet discussed the most suitable way to physically implement the processes, such as through the use of manual processes, computer programs, or other office technology. In other words, at this stage, the sub-systems of the system simply define logically WHAT data must be processed, WHEN it must be processed, and who will consume the information (aka “end-users”), but it most definitely does not specify HOW the sub-system is to be implemented.
Following this, developers determine a suitable approach for physically implementing each sub-system. This decision should ultimately be based on practicality and cost effectiveness. Sub-systems can be implemented using manual procedures, computer procedures (software), office automation procedures, or combinations of all three. Depending on the complexity of the sub-system, several procedures may be involved. Regardless of the procedures selected, developers must establish the precedent relationships in the execution of the procedures, either sequentially, iteratively, or by choice (thereby allowing divergent paths). By defining the procedures in this manner, from start to end, the developers are defining the “work flow” of the sub-system, which specifies HOW the data will be physically processed (including how it is to be created, updated, or referenced).
Defining information systems logically is beneficial for two reasons:
* It provides for the consideration of alternative physical implementations. How one developer designs it may very well be different than the next developer. It also provides the means to effectively determine how a purchased software package may satisfy the needs. Again, the decision to select a specific implementation should be based on practicality and cost justification.
* It provides independence from physical equipment, thereby simplifying the migration to a new computer platform. It also opens the door for system portability. For example, our consulting firm helped a large Fortune 500 conglomerate design a single logical payroll system which was implemented on at least three different computer platforms used by its various operating units; although the implementations physically worked differently, it was all the same basic system producing the same information.
These logical and physical considerations lead to our final concept…
4. A SYSTEM IS A PRODUCT THAT CAN BE ENGINEERED AND MANUFACTURED LIKE ANY OTHER PRODUCT.
An information system can be depicted as a four level hierarchy (aka, “standard system structure”):
LEVEL 1 – System
LEVEL 2 – Sub-systems (aka “business processes”) – 2 or more
LEVEL 3 – Procedures (manual, computer, office automation) – 1 or more for each sub-system
LEVEL 4 – Programs (for computer procedures), and Steps (for all others) – 1 or more for each procedure
Click for diagram: http://www.phmainstreet.com/mba/pride/issss.jpg
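One way to picture the standard system structure is as a simple data model. Here is a minimal sketch in Python; the payroll names are illustrative, not taken from any real system.

```python
from dataclasses import dataclass, field

@dataclass
class Procedure:      # Level 3: manual, computer, or office automation
    name: str
    kind: str         # "manual" | "computer" | "office automation"
    units: list = field(default_factory=list)  # Level 4: programs or steps

@dataclass
class SubSystem:      # Level 2: a business process
    name: str
    procedures: list = field(default_factory=list)

@dataclass
class System:         # Level 1
    name: str
    sub_systems: list = field(default_factory=list)

payroll = System("Payroll", [
    SubSystem("Time Capture", [
        Procedure("Collect timesheets", "manual", ["Review", "Approve"]),
        Procedure("Load hours", "computer", ["EDIT-HOURS", "POST-HOURS"]),
    ]),
    SubSystem("Pay Calculation", [
        Procedure("Compute pay", "computer", ["CALC-GROSS", "CALC-NET"]),
    ]),
])
```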
Each level represents a different level of abstraction of the system, from general to specific (aka, “Stepwise Refinement” as found in blueprinting). This means design is a top-down effort. As designers move down the hierarchy, they finalize design decisions. So much so, by the time they finish designing Level 4 for a computer procedure, they should be ready to write program source code based on thorough specifications, thereby taking the guesswork out of programming.
The hierarchical structure of an information system is essentially no different than any other common product; to illustrate:
LEVEL 1 – Product
LEVEL 2 – Assembly – 2 or more
LEVEL 3 – Sub-assembly – 1 or more for each assembly
LEVEL 4 – Operation – 1 or more for each sub-assembly
Again, the product is designed top-down and assembled bottom-up (as found in assembly lines). This process is commonly referred to as design by “explosion” (top-down), and implementation by “implosion” (bottom-up). An information system is no different in that it is designed top-down, and tested and installed bottom-up. In engineering terms, this concept of a system/product is commonly referred to as a “four level bill of materials” where the various components of the system/product are defined and related to each other in various levels of abstraction (from general to specific).
This approach also suggests parallel development. After the system has been designed into sub-systems, separate teams of developers can independently design the sub-systems into procedures, programs, and steps. This is made possible by the fact that all of the data requirements were identified as the system was logically subdivided into sub-systems. Data is the cohesive bond that holds the system together. From an engineering/manufacturing perspective it is the “parts” used in the “product.” As such, management of the data should be relegated to a separate group of people to control in the same manner as a “materials management” function (inventory) in a manufacturing company. This is commonly referred to as “data resource management.”
This process allows parallel development, which is a more effective use of human resources on project work as opposed to the bottleneck of a sequential development process. Whole sections of the system (sub-systems) can be tested and delivered before others, and, because data is being managed separately, we have the assurance it will all fit together cohesively in the end.
The standard system structure is also useful from a Project Management perspective. First, it is used to determine the Work Breakdown Structure (WBS) for a project, complete with precedent relationships. The project network is then used to estimate and schedule the project in part and in full. For example, each sub-system can be separately priced and scheduled, thereby giving the project sponsors the ability to pick and choose which parts of the system they want early in the project.
The standard system structure also simplifies implementing modification/improvements to the system. Instead of redesigning and reconstructing whole systems, sections of the system hierarchy can be identified and redesigned, thereby saving considerable time and money.
This analogy between a system and a product is highly credible and truly remarkable. Here we can take a time-proven concept derived from engineering and manufacturing and apply it to the design and development of something much less tangible, namely, information systems.
CONCLUSION
Well, that’s it, the four cardinal concepts of Information Systems theory. I have deliberately tried to keep this dissertation concise and to the point. I have also avoided the introduction of any cryptic vocabulary, thereby demonstrating that systems theory can be easily explained and taught so that anyone can understand and implement it.
Systems theory need not be any more complicated than it truly is.
If you would like to discuss this with me in more depth, please do not hesitate to send me an e-mail at timb001@phmainstreet.com
(For Milt and Les).
Tim Bryce is a writer and management consultant located in Palm Harbor, Florida. http://www.phmainstreet.com/timbryce.htm
February 22nd, 2015 – The Latin America Cloud Analytics Market is Expected to Reach $688.5 Million in 2019 – A Report by MicroMarket Monitor
(PRWEB) January 28, 2015
The Latin America Cloud Analytics market report defines and segments the cloud analytics market in Latin America, with analysis and forecast of revenue. This market is estimated to grow from $191.7 million in 2014 to $688.5 million in 2019, at a Compound Annual Growth Rate (CAGR) of 29.1%.
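The CAGR figure is easy to sanity-check from the two endpoints; a minimal sketch in Python:

```python
# CAGR = (end / start) ** (1 / years) - 1
start, end, years = 191.7, 688.5, 5               # $M, 2014 -> 2019
print(f"{(end / start) ** (1 / years) - 1:.1%}")  # -> 29.1%
```

The same formula reproduces the 22.3% CAGR quoted below for the European market ($1,482.0 million to $4,052.9 million over the same period).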
Browse through the TOC of the Latin America Cloud Analytics market report to get an idea of the in-depth analysis provided. It also provides a glimpse of the segmentations in the market, and is supported by various tables and figures.
Cloud analytics is a cloud-enabled service model that allows an organization or individual to perform business analysis by providing the analytics elements through public, private and hybrid clouds. Data analytics and cloud computing are the two factors that have led to the evolution of cloud-based data analysis, and together they have enabled the use of cloud-based business analytics in small, medium and large enterprises alike.
In Latin America, various factors, such as cost-effectiveness, easy installation and high growth in big data, are driving the cloud analytics market. The key players in this market include IBM, Oracle, SAP, SAS, Salesforce and others. Many of IBM’s popular business analytics applications are now available in cloud-based or software-as-a-service (SaaS) editions. These solutions not only get business analytics deployments up and running in minutes, but also reduce the costs and risks of development and deployment.
Early buyers will receive 10% customization on this report.
Latin America holds a substantial share of the overall cloud analytics market, which is expected to grow exponentially by the end of 2019. The organizations operating in this region are considered to be highly tech-savvy and innovative. The quest for a competitive edge leads these organizations to exploit advanced technologies and get the best out of them. Moreover, the rapid adoption of mobile technology and the extensive use of social media are also contributing to the growth of the cloud analytics market in Latin America.
The Latin America cloud analytics market is segmented and forecast on the basis of solutions, such as cloud BI tools, hosted data warehouse solutions, complex event processing, enterprise performance management, governance, risk and compliance, and analytics solutions. It is also segmented on the basis of industry verticals, such as BFSI, healthcare, media & entertainment, telecommunications, consumer goods & retail, business & consulting, education & research, and manufacturing & energy. The market is further segmented and forecast on the basis of end-users, such as enterprises and small & medium businesses (SMBs). The Latin America cloud analytics market is also segmented on the basis of the major countries in this region: Brazil, Mexico and Colombia.
The report also includes market share and value chain analyses, and market metrics such as drivers and restraints. In addition, it presents a competitive landscape and company profiles of the key players in this market.
Related Report:
Europe Cloud Analytics Market:
The European cloud analytics market is expected to grow from $1,482.0 million in 2014 to $4,052.9 million by 2019, at an estimated CAGR of 22.3% during the forecast period. This market contributed 22.54% of the global market in 2014. The market can be segmented by geographies, applications, macro indicators, companies, types and technologies.
MicroMarket Monitor identifies and attends to various unmet needs of different industry verticals, which include value chain impact analysis. The company publishes about 12,000 market research reports on various micro markets across the world. The graphical nature and multidimensional analysis of these reports provide advanced business intelligence tools to clients in their particular target markets.
February 13th, 2015 – CICA 2015 International Conference Focuses on Increasing the Strategic Advantages of Captives
(PRWEB) January 20, 2015
The Captive Insurance Companies Association (CICA) 2015 International Conference, held March 8-10 at the Loews Royal Pacific in Orlando, Florida, offers a full agenda designed to explore the strategic advantages of captives across a wide range of business needs.
“Attendees will see our theme Captives: The Strategic Advantage is clearly addressed throughout 23 educational sessions. Those sessions will identify innovative uses of captives to enhance and strengthen business performance,” says Dennis P. Harwick, CICA President.
Conference program highlights include:
“How to Understand and Get the Most Out of Your Actuarial Report” – Demystifies report terminology and helps you identify underutilized information and better understand how your organization can use the actuarial report information for strategic risk management.
“Using Captives to Corral Credit & Political Risk” – Case studies will illustrate how risks are identified and addressed, including how captive utilization can enhance long-term risk management goals.
“Healthcare Captives…Survival of the Fittest” – Focuses on how to build a better healthcare captive, how to assess which risks to add and understanding the regulatory impact of the Affordable Care Act.
Additional program sessions include:
“New Business Strategy: Designing and Implementing a Cyber Risk Insurance Program”
“CICA’s Best Practices and Strategies for Transfer Pricing”
“Developing the Operational Strategy of Managing Medical Stop Loss in Your Captive”
“Big Data and Captives: Using Data and Dashboards to your Strategic Advantage”
“Developing the Next Generation of Captive Professionals”
Special thanks to conference Gold Sponsors Johnson Lambert, Saslow, Lufkin & Buggy LLP, and Zurich for their support and helping to make the 2015 CICA International Conference possible.
For information on the full program, travel details and registration fees for the CICA 2015 International Conference, visit the CICA website.
About the Captive Insurance Companies Association (CICA)
CICA is the only global domicile-neutral captive insurance association. CICA is committed to providing the best source of unbiased information, knowledge, and leadership for captive insurance decision makers. CICA is your advocate around the world, key to the captive industry and the resource for captive best practices.
February 9th, 2015 – AvePoint Provides Broader Business Protection through Data Loss Prevention with Launch of Compliance Guardian SP 3
Jersey City, NJ (PRWEB) January 13, 2015
AvePoint, the established leader in enterprise-class big data management, governance, and compliance software solutions for next-generation social collaboration platforms, today announced the general availability of AvePoint Compliance Guardian Service Pack (SP) 3. Compliance Guardian mitigates privacy, information security, and compliance risks across your information gateways with a comprehensive risk management process allowing organizations to document their policies, implement and measure them, and demonstrate conformance.
Compliance Guardian enables organizations to understand their current environment through data discovery, and then allows them to build a culture of compliance in which the business can say what it does to protect sensitive data, do what it says, and prove that it has appropriate controls in place for its sensitive data. Capabilities include:
Data Leakage Detection – Compliance staff are able to centrally monitor the status associated with any incident, as well as its associated risk levels to ensure critical violations are prioritized according to business need. Combined with action trend reports and detailed historical analysis, organizations will be able to track and manage incidents more efficiently to drive a successful risk management lifecycle.
Data at Rest – AvePoint Compliance Guardian helps you discover dark data wherever it lives in your current IT environment through its extensible APIs. Compliance Guardian data repository scans are currently available out-of-the-box for file shares; databases; Web sites, applications, and Web-based systems; SharePoint; Cloud; and social platforms – including Microsoft Lync and Yammer.
Data Identification – Protect regulated and sensitive data from harmful leaks or misuse with scheduled and real-time context-aware reporting and classification. This includes Fingerprinting, Compliance Guardian’s latest check type, which allows administrators to use file patterns as test criteria in order to identify files that are identical or similar.
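In general, fingerprinting of this kind combines whole-content hashing (to catch identical files) with hashing of overlapping text shingles (to catch similar ones). A minimal sketch of the generic technique in Python; this illustrates the concept only and is not AvePoint’s implementation:

```python
import hashlib

def exact_fingerprint(data: bytes) -> str:
    """Whole-content hash: identical files share this fingerprint."""
    return hashlib.sha256(data).hexdigest()

def shingles(text: str, k: int = 8) -> set:
    """Hashed k-character shingles: similar files share many of them."""
    return {hash(text[i:i + k]) for i in range(max(1, len(text) - k + 1))}

def jaccard(a: set, b: set) -> float:
    """Overlap between two shingle sets, from 0 (disjoint) to 1 (identical)."""
    return len(a & b) / len(a | b) if (a | b) else 1.0

doc1 = "Customer SSN 123-45-6789; account details follow."
doc2 = "Customer SSN 123-45-6789; account summary follows."
print(jaccard(shingles(doc1), shingles(doc2)))  # a high score flags a near-duplicate
```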
Compliance Guardian is designed to ensure that information is available and accessible to the people who should have it and protected from the people who should not by taking a comprehensive lifecycle approach.
Discover data across multiple information gateways in your enterprise to shed light on dark data and other potential sources of risk
Scan content in motion or at rest against out-of-the-box or customized checks for a wide range of privacy, information assurance, operational security, sensitive security information, and accessibility requirements
Drive enterprise classification and taxonomy with user-assisted and automated classification
Take corrective action automatically to secure, delete, move, quarantine, encrypt, or redact risk-defined content
Enhance incident tracking and management with an integrated incident management system in addition to trend reports and historical analysis to measure your organization’s compliance improvements over time
Monitor data and systems on an ongoing basis to demonstrate and report on conformance across your enterprise-wide information gateways and systems with one system that works where you work
“Trust is something that businesses must work to establish with their customers every day. Once lost, it is very difficult to regain. Trust requires accountability and transparency,” said Dana Simberkoff, Chief Compliance and Risk Officer, AvePoint. “Compliance Guardian enables organizations to understand their current environment through data discovery and then allows organizations to build a culture of compliance in which the business can say what they do to protect sensitive data, do what they say, and prove that they have appropriate controls in place for its sensitive data.”
AvePoint is the established leader in enterprise-class big data management, governance, and compliance software solutions for next-generation social collaboration platforms. Focusing on helping enterprises in their digitization journey to enable their information workers to collaborate with confidence, AvePoint is first to market with a unique solution that centralizes access and control of information assets residing in disparate collaboration and document management systems on-premises and in the cloud. AvePoint solutions and services aim to bring together business, IT, as well as compliance and risk officers to serve key business objectives such as big data, cloud integration, compliance, enterprise content management, and mobile data access monitoring.
Founded in 2001 and based out of Jersey City, NJ, AvePoint serves more than 13,000 organizations in five continents across all industry sectors, with focused practices in the energy and utilities; financial services; healthcare and pharmaceuticals; and public sector industries. AvePoint is a Microsoft Global ISV Partner, Gold Certified Collaboration and Content Partner and winner of 2014 Microsoft Partner of the Year Award in Public Safety and National Security, as well as a US Government GSA provider via strategic partnerships. AvePoint is privately held and backed by Goldman Sachs and Summit Partners.
All product and company names herein may be trademarks of their registered owners.
February 5th, 2015 – Identity Theft Resource Center Breach Report Hits Record High in 2014
SAN DIEGO, Calif. (PRWEB) January 12, 2015
The number of U.S. data breaches tracked in 2014 hit a record high of 783, according to a recent report released by the Identity Theft Resource Center (ITRC) and sponsored by IDT911™. This represents a substantial hike of 27.5 percent over the number of breaches reported in 2013 and a significant increase of 18.3 percent over the previous high of 662 breaches tracked in 2010. The number of U.S. data breach incidents tracked since 2005 also hit a milestone: 5,029 reported incidents, involving more than 675 million estimated records.
Continuing a three-year trend, breaches in the Medical/Healthcare industry topped the ITRC 2014 Breach List with 42.5 percent of the breaches identified in 2014. The Business sector continued in its second place ranking with 33.0 percent of the data breach incidents, followed by the Government/Military sector at 11.7 percent. These categories were followed by the Education sector at 7.3 percent and Banking/Credit/Financial at 5.5 percent.
“With support from IDT911, the ITRC has been able to continue its efforts in tracking and understanding the complex issues surrounding the growing number of data breaches,” said Eva Velasquez, President and CEO, ITRC. “With an average of 15 breaches a week in 2014, consumers need to be made aware of the risk of exposure to personal identifying information in order to understand the threat posed by this growing list of data breach incidents.”
“The ubiquitous nature of data breaches has left some consumers and businesses in a state of fatigue and denial about the serious nature of this issue. While not all breaches will result in identity theft or other crimes, the fact that information is consistently being compromised increases the odds that individuals will have to deal with the fall out. The ITRC data breach reports are a necessary educational tool for businesses, government and advocates alike in our communication efforts,” Velasquez added.
In 2014, Hacking incidents represented the leading cause of data breach incidents, accounting for 29.0 percent of the breaches tracked by the ITRC. This was followed for the second year in a row by breaches involving Subcontractor/Third Party at 15.1 percent. Accidental Exposure of information in 2014 jumped to 11.5 percent, up from 7.5 percent recorded in 2013. Data on the Move dropped to 7.9 percent from the 12.9 percent identified in 2013.
The ITRC began tracking data breaches in 2005 and since that time has been maintaining an extensive database capturing and categorizing U.S. data breaches into five industry sectors with a number of other attributes such as how information was compromised and type of data.
“It is important to note that the 5,000 breach milestone only encompasses those reported – many breaches fly under the radar each day because there are many institutions that prefer to avoid the financial dislocation, liability and loss of goodwill that comes with disclosure and notification,” said Adam Levin, founder and chairman of IDT911. “Additionally, not all businesses are required to report they’ve had a breach for a variety of reasons, which means the number of breaches and records affected is realistically much higher.”
From 2007 to 2011, the Business sector, with a 10-year average of 34.3 percent, represented the largest percentage of breaches, often far surpassing the next highest category. The Medical/Healthcare sector, with a 10-year average of 26.4 percent, took over the top spot in 2012, attributed primarily to the mandatory reporting requirement for healthcare breaches being reported to the Department of Health and Human Services (HHS). In 2005, this category reported the least number of breaches.
In 2005 and 2006, Education and Government/Military held the spots for most breaches, at 47.8 percent and 30.8 percent respectively. These two categories now represent 10-year averages of 15.3 percent and 15.8 percent. The Banking/Credit/Financial industry, with a 10-year average of 8.1 percent, has reported the least number of breaches for nine of the past 10 years.
Over the years, hacking has been a primary cause of data breach incidents, leading to an 8-year average of 21.7 percent. Data on the Move, a leading cause of breaches in 2007 and 2008, ranks second with an average of 15.9 percent. (This category includes storage devices or laptops lost in transit.) Insider theft and Accidental Exposure follow at just over 12 percent, and Subcontractor/Third Party follows at 11.2 percent.
“Without a doubt, 2015 will see more massive takedowns, hacks, and exposure of sensitive personal information like we have witnessed in years past,” said Levin. “Medical data and business information like intellectual property will be prime targets, with cyber thieves looking for opportunistic financial gain based on black market value, corporate extortion and cyber terrorism.”
Other categories have been added to the ITRC database over the years. Employee Negligence was added in 2012 (3-year average = 9.5 percent) and Physical Theft was added in 2014 (12.5 percent of 2014 breaches). An eight-year average of 29.6 percent of all breaches reported on and tracked by the ITRC did not have sufficient information to identify the cause.
The ITRC continues to track paper breaches even though these types of breaches seldom trigger state breach notification laws. This type of occurrence has dropped considerably since the high of 26 percent recorded in 2009, with an 8-year average of 17.1 percent.
It is also noteworthy that the reporting of “Unknown” for the total number of records exposed on the ITRC Breach List has shown a decline since the high of 50.3 percent in 2012. This again may be due to the high percentage of medical/healthcare entities, which are required to report the number of records. The eight-year average of “Unknown” is 41.5 percent.
Tracking of breaches involving Social Security numbers and credit card/debit card information began in 2010. The exposure of SSNs has shown a definite decline over the past five years since the high of 62.1 percent in 2010. The same cannot be said for credit/debit cards, which reflected a rise in both 2013 and 2014. The five-year averages for these two categories are 51.3 percent and 19.8 percent respectively.
“I would love to report that the decline in breaches exposing SSNs is a testament to increased security efforts by institutions to protect the golden ticket to a consumer’s identity,” continued Levin. “Unfortunately, those compromises have been dwarfed by the alarming and exponential rise in successful attacks on point-of-sale systems at big box retailers. The FBI estimates that more than 1,000 retailers are under assault with the same (or tweaked versions) of the malware that compromised Target and Home Depot.”
He concludes, “Therefore, it is incumbent on consumers to be their own best guardians by controlling what personal information they make available in order to minimize their risk of exposure, monitoring their accounts daily so they know as quickly as possible when they have an issue, and having a damage control program to help them get through identity-related problems quickly, efficiently and thoroughly.”
For 10 years, the ITRC has been committed to dedicating resources to providing the most accurate review and analysis of U.S. data breach incidents. This has long involved adding new categories and updating methodologies to best capture patterns and any new trends.
“Maintaining a quality multi-year data breach incident database necessitates constant attention to what is happening in the world of data breaches, legislative efforts on breach notification and industry reactions to incident response preparedness,” said Karen Barney, ITRC data breach analyst. “As a credible and thorough resource, the ITRC is able to provide unique insight and information to consumers and businesses alike.”
About the ITRC Breach List
The ITRC Breach List is a compilation of data breaches confirmed by various media sources and/or notification lists from state governmental agencies. Breaches on this list typically have exposed information that could potentially lead to identity theft, including Social Security numbers, financial account information, driver’s license numbers and medical information. This data breach information, and available statistics, have become a valuable resource for media, businesses and consumers looking to become more informed on the need for best practices, privacy and security measures in all areas – both personal and professional.
About the ITRC
Founded in 1999, the Identity Theft Resource Center® (ITRC) is a nationally recognized non-profit organization which provides victim assistance and consumer education through its toll-free call center, website and highly visible social media efforts. It is the mission of the ITRC to: provide best-in-class victim assistance at no charge to consumers throughout the United States; educate consumers, corporations, government agencies, and other organizations on best practices for fraud and identity theft detection, reduction and mitigation; and, serve as a relevant national resource on consumer issues related to cybersecurity, data breaches, social media, fraud, scams, and other issues. Visit http://www.idtheftcenter.org. Victims may contact the ITRC at 888-400-5530.
About IDT911™ (IDentity Theft 911®)
Founded in 2003, IDT911™ is the nation’s premier consultative provider of identity and data risk management, resolution and education services. The company serves 17.5 million households across the country and provides fraud solutions for a range of organizations, including Fortune 500 companies, the country’s largest insurance companies, employee benefit providers, banks and credit unions and membership organizations. A subsidiary of IDT911, IDT911 Consulting™ provides information security and data privacy services to help businesses avert or respond to a data loss incident. Together, the companies provide preventative and breach response services to more than 770,000 businesses in the United States, Canada and the United Kingdom. IDT911 is the recipient of several awards, including the Stevie Award for Sales and Customer Service and the Phoenix Business Journal Tech Titan award for innovation in breach and fraud-fighting services. The company is the organizer of the Privacy XChange Forum, an annual conference that brings together high profile privacy thought leaders. For more information, please visit http://www.idt911.com, http://www.idt911consulting.com, http://www.facebook.com/idt911 and http://www.twitter.com/idt911.
February 1st, 2015 – Knowledgent Implements First-of-Its-Kind Healthcare Enterprise Data Lake
Warren, NJ (PRWEB) January 08, 2015
Knowledgent, the data and analytics firm, today announced a first-of-its-kind data lake for a major healthcare enterprise. This data and analytics solution enables both business analysts and data scientists to access a wide variety of structured and unstructured datasets in a consolidated data management environment for agile, advanced analytics.
Knowledgent was engaged by this healthcare enterprise to develop and implement a data and analytics strategy that addressed the enterprise’s challenge around decentralized data management, which had led to a lack of governance and control, causing extensive resource misalignment. Knowledgent partnered with Hortonworks®, the contributor to and provider of enterprise Apache™ Hadoop®, for the implementation of a next-generation data warehouse and analytics environment, leveraging a Hadoop-based data lake. This engagement empowered business users and data scientists to rapidly access and analyze structured, semi-structured, and unstructured data for advanced analytics. A case study about this engagement can be found on the Knowledgent blog.
“Our partnership with Hortonworks was a critical enabler of the Hadoop-based data lake,” said Matthew Arellano, Healthcare Portfolio Partner at Knowledgent. “As a result, business analysts and data scientists within the enterprise can focus their talents on producing better and faster models for such analytics as member retention, utilization management, Medicare STARS, and risk adjustment.”
“The development of industry-specific applications is a natural next step in the evolution of the Hadoop market and we look forward to working with Knowledgent to provide data-driven applications for their customers,” said John Kreisa, Vice President of Strategic Marketing at Hortonworks. “Our industry-leading Hadoop distribution integrated seamlessly with the data lake solution that, powered by Knowledgent’s expertise in data and analytics, will help users maximize the value from their data.”
About Knowledgent
Knowledgent is a data and analytics firm that helps organizations transform their information into business results through data and analytics innovation. Our expertise seamlessly integrates industry experience, data analytics and science capabilities, and data architecture and engineering skills to uncover actionable insights.
Knowledgent operates in the emerging world of big data as well as in the established disciplines of enterprise data warehousing, master data management, and business analysis. We have not only the technical knowledge to deliver game-changing solutions at all phases of development, but also the business acumen to evolve data initiatives from ideation to operationalization, ensuring that organizations realize the full value of their information.
January 28th, 2015 – Tech Pioneers Predict the Future of Lifestyle Technology in 2015
New York, NY (PRWEB) January 03, 2015
Living in Digital Times, a series of conferences and exhibits at the 2015 International CES®, brings together the newest innovations and leading lifestyle technology experts, with over 200 exhibitors and over 300 conference speakers this year. From kids who code, to high-tech education, to family entertainment and tools, to fitness and health, to the growing senior dependency on technology, it’s clear that this year’s most innovative products at CES target consumers wanting a more digital lifestyle.
LIDT asked some top conference speakers to weigh in on the future of technology, posing this question: What is the greatest impact technology will have on your field in 2015?
Advances in Technology
“Everyone in America will either have their own start up or they’ll be replaced by robots and just spend 2015 on the beach. Seriously, we can expect more collaborative maker environments and more startups doing great things. I also expect an upswing in brick and mortar stores because people are hungry for physical experiences, not just clicking. The electronic wallet and alternative forms or payment will begin to go mainstream and 2015 will be remembered as the year of the wearable.” – Robin Raskin, founder and president of Living in Digital Times
“The International CES will continue to be the place where companies big and small and from all facets of technology come together to share a common bond, a belief in innovation to help make the world a better place. In 2015, we’ll see gadgets coalesce into ecosystems to ensure even greater connectivity. Whether it’s your car, your home, or your mobile life, all of the various technologies you own will talk to each other to make our lives seamless, integrated and safer. We’ll see a new generation of products that just work.” – Gary Shapiro, president and CEO, Consumer Electronics Association (CEA) ®
Advances in Kids Tech
“2015 will be ‘more more more’ especially more apps made by small, smart, extremely motivated publishers. They’ll help Apple further dominate the multi-touch space, further crowding out video game and toy options. The great screen debate will mature in 2015, shifting from ‘no screens’ to ‘which apps’ and the effort to help children become coders and makers will be made easier thanks to more DIY robots and various types of kits. The hype surrounding 3D printing will shift to something worth getting excited about in 2015 — 3D Goggles.” – Warren Buckleitner, editor, Children’s Technology Review; Moderator, FamilyTech
“The toy Industry looks to technology predominantly for trends. These trends typically filter down to children over time. The toy industry also looks to technology to execute new and old features in less expensive and more exciting ways. However, I believe that the greatest impact technology has on our industry is the competition that comes from it, which ultimately takes away from the sales of traditional toys. That’s why it’s important for the toy industry to remember that, as much as technology changes, advances and continues to blow our minds, we need to not lose sight of the fact that a child still needs to feel that there’s a beating heart. I believe that the toy Industry understands this in profound ways that are unique to children.” – Ben Vardi, executive vice president, SpinMaster; Panelist, FamilyTech
Advances in Education
“In a word: results. When we apply the science of learning with personalized technology that engages students and meets their needs, we can better prepare them for success in college and subsequently in the workforce. And by developing technology that supports the digital ecosystems that our schools and universities are cultivating, we can make life easier and efficient for educators, as well.” – David Levin, president and CEO, McGraw-Hill Education; Speaker, TransformingEDU
“Now that higher education has fully recovered from the Spice Girls-esque phenomenon of Massive Open Online Courses (MOOCs), colleges and universities are focusing on using technology to improve the affordability and efficacy of their programs that students and employers actually care about: degrees. Accredited universities are utilizing online delivery to enable competency-based learning, a much more cost effective way to deliver higher education, allowing students to earn degrees for as little as $ 5,000. Other universities are employing adaptive learning and gamification, which promise to significantly improve student outcomes.” – Ryan Craig, managing director, University Ventures; Panelist, TransformingEDU
“In 2015 we will see tremendous strides made in personalized learning. The birth of the ‘adaptive MOOC’ and more ‘Open MOOCs’ will kick start the Webification of education. The Web is the most robust publisher of learning content in the world. The problem is getting that information into a learning environment that’s malleable and supportive of content of all kinds. We’re not there yet, but when it happens it might be the most transformative moment in education in the last 100 years. Publishers will thrive when this happens, too. They will be able to move beyond teaching institutional knowledge and double down on specialty content that only publishers can create, thus building a bigger moat around their business model. In addition, we will see a focus on the personalized learning experience more than the underlying technology. Many have built personalized learning solutions, but where they have failed is making them modular, adaptable, enjoyable experiences. That’s gonna change, and for the better, in the coming 12 months.” – Andrew Smith Lewis, founder and executive chairman, Cerego; Panelist, TransformingEDU
Advances in Fitness and Health
“When it comes to social fitness there are three types of consumers. Those who share everything with everyone, those who share directly within their fitness community and those who track fitness data solely for personal development. To truly develop an ecosystem that provides useful information for all three of these audiences will be the challenge in 2015. Keeping existing athletes motivated and encouraging them to expand their social networks to those just starting their fitness journey is the real opportunity for growth.” – Robin Thurston, SVP Digital, Under Armour; Presenter, FitnessTech Summit
“The connected health revolution is transforming our habits by putting us at the front and center of our own health management. In order for technology to truly serve its purpose beyond early adopters, it must be invisible. Wearable needs to become truly wearable with genuine fusion between fashion and technology. Stationary sensors need to be embodied into beautiful hardware that does not come in the way of our daily routine. Upon achieving these principles, connected health devices and the technology inside of them can become true lifelong companions for better health and wellness.” – Cedric Hutchings, CEO, Withings; Speaker, Digital Health Summit
“Within the headphone category and sports headphones in particular you will see a lot more convergence with other wearable technologies. This will not only simplify the daily routine of bringing these health and fitness devices into one’s life but also make for a better value proposition encouraging more to do so.” – Bruce Borenstein, president & CEO, AfterShokz; Panelist, FitnessTech Summit
“2015 will be the year in fitness tech where the conversation moves beyond the latest whiz bang device and evolves beyond the simple reverence for more data. We will and we must start talking in terms of information and advice and acknowledge the role that coaches and trainers, be it virtual or in person, play in the larger equation.” – Michael Yang, managing director, Comcast Ventures; Moderator and Panelist, FitnessTech Summit
“The transformative changes happening across the healthcare landscape, which are being fueled by new players, products, business models, and record amounts of venture capital, have elevated the dialogue about being and staying healthy. Our job in 2015 will be to leverage technology to make health information more understandable and actionable, to improve access and the quality of care, and to help people be better healthcare consumers so they can live healthier lives.” – David Schlanger, chief executive officer, WebMD; Special Guest, Digital Health Summit
Advances in Wearables
“There will be a greater pull for highly functioning – yet aesthetically beautiful – wearables that enable a deeper connection to ourselves, our environment, and each other. Well-designed brain sensors will markedly disrupt the field of wearable technology by democratizing access to inner workings of the brain and behavior, allowing people to do more with their mind than they ever thought possible.” – Ariel Garten, CEO and co-founder of InteraXon; Panelist, FamilyTech
Advances for Seniors and Baby Boomers
“Technology can bring independence and connectedness, which can result in happier lives and offer real possibilities for people as they age. AARP recognizes that Americans 50+ are ready, willing and able to take advantage of emerging technology in 2015. These new technology products will utilize sensors that are more sensitive and less intrusive. Wearables, such as designer-quality connected watches, will surface that easily respond to voice command and address multiple needs around health and fitness, medication management, and emergency response – and also tell time. Products with greater interoperability for the connected home will be among new options. We will also see a growing number of technologies for older adults that link in their caregivers, as well as a growing number of products designed for younger age groups being bought and used by older age groups where ease of use, and plug-and-play, are the standard.” – Jody Holtzman, senior vice president, Thought Leadership, AARP; Speaker, Lifelong Tech Summit
“2015 will see two key transitions. First, the adoption of consumer technologies will play an increasingly important role in connected healthcare due to the efficiency gains through using consumer devices and bring-your-own-device models. Secondly, cost and quality accountability will continue to be a focus for healthcare institutions. These two trends together will drive a transition for connected healthcare as institutions transition from ‘pilotitis’ to ‘standard of care’ for the on-going management of those with chronic conditions and post-acute transitions. This combination, coupled with proven financial sustainability of these models, will help grow these care models in scale.” – Marcus Grindstaff, vice president, US Sales and Global Market Development, Intel-GE Care Innovations™; Speaker, Lifelong Tech Summit
Owned and produced by the Consumer Electronics Association (CEA)®, the 2015 International CES will occur January 6-9, 2015 in Las Vegas, Nevada.
About Living in Digital Times:
Founded by veteran technology journalist Robin Raskin, Living in Digital Times brings together the most knowledgeable leaders and the latest innovations impacting both technology and lifestyle. It helps companies identify and act on emerging trends, create compelling company narratives, and do better business through strong network connections. Living in Digital Times produces technology conferences, exhibits and events at the International CES and other locations throughout the year by lifestyle verticals. Core brands include Digital Health Summit, FitnessTech Summit, Lifelong Tech, Kids@Play Summit, Family Tech Summit, TransformingEDU, MommyTech TV, Wearables and FashionWare runway show, Mobile Apps Showdown, Last Gadget Standing, Battle of the Bands, and the KAPi Awards. The company also works with various foundations and manages the Appreneur Scholar awards program for budding mobile entrepreneurs. For more information, visit http://www.LivinginDigitalTimes.com and keep up with our latest news on Twitter, LinkedIn and Facebook.
About CES:
The International CES is the world’s gathering place for all who thrive on the business of consumer technologies. It has served as the proving ground for innovators and breakthrough technologies for more than 40 years – the global stage where next-generation innovations are introduced to the marketplace. As the largest hands-on event of its kind, CES features all aspects of the industry. And because it is owned and produced by the Consumer Electronics Association (CEA), the technology trade association representing the $211 billion U.S. consumer electronics industry, it attracts the world’s business leaders and pioneering thinkers to a forum where the industry’s most relevant issues are addressed. Follow CES online at CESweb.org and through social media: cesweb.org/social.