Discover the Best in Technology and Data

The world's leading brands post jobs and advertise on icrunchdata's award-winning platform.

Job Seekers

Post your resume and get noticed
by industry-leading companies.
Add Resume


Employers

Advertise your job to reach the
best talent in Technology and Data.
Post a Job

Latest Insights

The European Union (EU) General Data Protection Regulation (GDPR) took effect May 25, 2018, yet only 34.5 percent of nearly 500 professionals involved in GDPR compliance efforts say their organizations can defensibly demonstrate compliance with the new data privacy rules as of today, according to a recent Deloitte poll. Litigation, regulatory, and internal investigation challenges could abound for others. One-third of respondents (32.7 percent) hope to be compliant within 2018. Another 11.7 percent plan to take a "wait and see" approach amid uncertainty over how EU regulators in various countries will enforce the new regulation.

"The fact that the GDPR effective date has come and gone and many are still scrambling to demonstrate a defensible position on GDPR compliance reflects the complexity and challenges as the world of privacy rapidly changes," said Rich Vestuto, a Deloitte Risk and Financial Advisory managing director in discovery for Deloitte Transactions and Business Analytics LLP.

Third-party contract management for GDPR compliance

Only 13.6 percent of respondents are confident that their organizations know what data third parties have and are leveraging AI and other technologies to analyze and manage third-party contracts for GDPR compliance. A majority (56 percent) aren't done discerning what data third parties have or the potential implications of GDPR on third-party contract management. Some (10.2 percent) have yet to begin addressing third-party GDPR compliance at all.

Vestuto added, "Among the biggest GDPR compliance challenges is third-party contract management. Under GDPR, organizations are responsible for ensuring privacy protection of EU-regulated data shared with or used by vendors and service providers, which requires those organizations to know who their vendors are and precisely what data those third parties hold.
Updating or renegotiating contracts and agreements may help ensure third parties are GDPR-compliant when using your organization's EU-regulated data."

Discovery challenges loom for 30 percent

Discovery will be harder for their organizations now that the GDPR is enforceable, according to 30.6 percent of respondents. Surprisingly, 18.6 percent expect discovery to actually become easier under GDPR. Some (17.2 percent) expect no change to their organizations' discovery practices as a result of GDPR taking effect.

"Even those professionals closely involved in GDPR compliance may not fully appreciate the implications the new rules may have for discovery related to regulatory inquiry responses, litigation and internal investigation proceedings – as well as other aspects of their businesses," Vestuto cautioned.

Scalability is key as more jurisdictions add data privacy rules

Nearly half of respondents (48.2 percent) say their organizations' data privacy programs are scalable to address pending rules in other jurisdictions even if their immediate focus is GDPR. However, 19.8 percent report that their organizations' programs are focused solely on GDPR without scalability, potentially leaving them unprepared to deal with new rules elsewhere.

Vestuto concluded, "Other jurisdictions beyond the EU are enacting more stringent data privacy protections. Data privacy programs should be scalable and requirements rationalized on a global basis to ensure that organizations are able to address current and pending rules in various jurisdictions as needed."

About the online poll

On June 22, 2018, a Deloitte Dbriefs webcast titled "EU General Data Protection Regulation: practical steps for compliance" polled more than 490 professionals involved in their organizations' GDPR compliance efforts. Answer rates differed by question.

Article published by Anna Hill
Image credit: Getty Images, E+, gremlin

Want more?
For Job Seekers | For Employers | For Influencers
Increasingly, CIOs say to me that the value of what they are doing is instantiated through data and analytics. But how do you build an analytics capability that works for the business as a whole? This is the question that I recently asked the #CIOChat. Their answers should have value to everyone involved in data or analytics.

Should analytics be managed for the business or for operating groups?

CIOs had a variety of opinions regarding the operating models for analytics teams. Some believe the model should depend upon the company and its industry. These CIOs suggest that a decentralized analytics organization is best. In addition, they believe that the future of IT and of your business is very, very distributed. For this reason, they want to bring about self-service capabilities and allow subject matter experts to make data and analytics actionable. They suggest, interestingly enough, that what should possibly be centralized is data modeling and machine learning, because these skills are hard to acquire and keep.

Other CIOs suggest that for larger and more federated organizations, analytics should be a distributed function. These CIOs believe – as Tom Davenport suggested in "Analytics at Work" – that analytics should be managed for the entire business. They stress that continuous effort is needed to leverage data but believe there is value in embedding data scientists into every business unit. In general, these CIOs say analytics teams should be managed for the entire business to ensure maximum ease of sharing and improvement of data collection and data security. Importantly, they argue that data integrity and cleanliness need to be owned by a central team, while the entire organization should be empowered to leverage analytics. They suggest that most analytics have historically been app specific. Yet, they say, analytics needs to become a core competency of organizations.
Tom Davenport argues in his book for a single corporate team that is farmed out to projects for the enterprise and its business units. This prevents what some CIOs worry about – a siloed analytics organization chart. When this occurs, there is no telling how much effort is duplicated in the analytics space or how many standards are followed. CIOs suggest that it is important to understand that distributed and siloed are not the same. You can be distributed and still leverage knowledge across organizations. Some CIOs suggest that another way of solving the issue is to establish an analytics enablement and governance group that helps coordinate decentralized efforts around the organization. The goal here should be to share knowledge, costs, know-how, and toolkits while tackling shared goals. CIOs say this kind of "Center of Excellence" thinking can accelerate business results.

Analytical maturity

CIOs shared openly that everything they do increasingly has to do with analytics. So, it is essential to have the right context when understanding where to go with analytics. Some CIOs say that they are building a group that will help to manage analytics across their organization, but pockets of specialized expertise remain. This appears to fall between stages 3 and 4 in Tom Davenport's maturity model – stage 3 establishes "governance of technology and architecture for analytics" and stage 4 "manages analytical priorities and assets at an enterprise level" ("Analytics at Work," Tom Davenport, page 53). CIOs insist that for distributed analytics to work, there needs to be alignment on data governance, tools, and solutions. Otherwise, there will be multiple versions of the truth, and therefore confusion. Making the problem more difficult, CIOs say they are seeing more apps and infrastructure with their own embedded analytics. Regardless, CIOs say the trick is to leverage data and analytics across all business workflows. Distributed analytics must align with how data and analytics are governed.
While some CIOs cringe at the notion of 'data ownership', they believe in the need for role-based permissions (a component of a data strategy) for data access. CIOs suggest that data husbandry is critical and that few have an effective strategy. It is critical that you define "owner" in the context of data governance. As well, there should be no duplicate owners – only users working from trusted data sources. They see access as an analytics enabler. Appropriate corporate policy needs to establish acceptable use of the data within and external to organizations. CIOs say that the real value from analytics comes through the integrity of the data and having an enterprise data strategy, regardless of where the analytics teams live. To what extent do organizations have an enterprise data strategy? At minimum this should include policy, standards, definitions, models, migration, integrations, security, and access control. Data strategy needs to drive analytics.

Which business questions should analytics be focused upon?

CIOs suggest analytics and data should be focused on producing real-time results and answers that propel the business forward. Delighting the customer should be a significant use case. However, CIOs felt the right answers are related to the business questions an organization is trying to solve, whether customer-facing or general line-of-business operations. Some CIOs say the future is here; it is, however, unevenly distributed. They see increasing focus on operations, especially finance. Other CIOs say that many organizations think customer experience is only about reporting and analytics. Ugly dashboards and reports cannot be the endpoint, as it has become clear that winning customer experience is a much bigger thing. Is this an area for CIO influence? CIOs suggest that the big question is how capable and how applicable built-in analytics are. They say that many software solutions include descriptive analytics and a mix of advanced analytics.
But since analytics is not the core business of these vendors, their offerings do not tend to be very good or flexible. These CIOs think businesses need to apply analytics in marketing, sales, and customer-facing products and services. Ops, they say, already has a lot of reporting tools and a long history of finding inefficiencies. CIOs need to be able to strategize, deliver, and influence effectively regardless of distributed, federated, centralized, or hybrid models. Actionable data should inform continuous process improvement, which every business unit or line of business should use for decision support, prioritization, and allocation of resources.

What types of analytical approaches are most prevalent?

CIOs are candid that there is a lot of dashboarding with predictive analytics. Many, they say, are trying to do text mining. They say financial services organizations are the main organizations doing time series analysis; nevertheless, the analytical approach needs to be appropriate to the situation. One CIO said that they haven't seen very many organizations doing streaming or real-time analysis. They most often see descriptive analytics; however, they stress that many organizations are still early in their analytics journeys. According to the CIOs, success with analytics requires experts in analytical approaches (data scientists) and experts in process improvement. They emphasize that the wrong approach can lead to bad insights and believe that analytics needs to use business outcomes as its compass. They stress that openness is the value they want around access to the data and analytics that are created. In other words, they want "Information Democracy," a term coined by Bernard Liautaud, the former CEO of Business Objects. With regard to questions about access and role-based permissions, CIOs want these defined by business process owners. Increasingly, CIOs see little value in using gut feel and historical data.
They are clear that historical data is less interesting than real-time data. However, they suggest that some historical trends, matched against real-time data, provide insight opportunities depending upon the industry and business functions. Many CIOs say that their organizations are moving to SaaS solutions that provide a data store, model analytics, comparative data, and analytical expertise, all at a single price. Further, they believe that many organizations do not have the technical chops to do real-time or streaming analytics at scale, or even to act on real-time analytics.

Distinguishing factors

CIOs suggest that driving value is possible when those most knowledgeable about the data and its nuances ensure that valid business questions are being asked and answered. It is essential that data be designed so it can roll up into a strategic view. CIOs, however, worry about analytical leaders in the space who are driving a desire to emulate Google and Walmart. CIOs say this is a necessary evolution for effective data use, and businesses that don't figure out how to leverage analytics for competitive advantage will soon be choking on data dust and fumes. For this reason, analytics is becoming a cost of doing business rather than a distinguisher on its own – what distinguishes a business is what the leadership does with higher velocity data and information. At the same time, CIOs are candid about the challenges. They say that the volume of accumulated data is getting so large that batch processing is an increasing problem. CIOs say many organizations are trying to discern what questions to ask, what questions add value, and how to identify a question that leads to actionable results. As one CIO put it, if you aren't thinking about strategic differentiation, you will be disrupted quickly; it's a key issue and strategic differentiator moving forward.
One CIO suggested that new business models and the desire to target different markets and customer personas are driving the need for data and analytics. These are baseline capabilities before organizations can really do IoT, AI, or machine learning. In sum, CIOs say analytics capability is table stakes: a negative differentiator if missing, neutral with a market-appropriate baseline, and a positive differentiator when producing advanced insights beyond the competition.

Parting remarks

Much has changed with CIOs and analytics in the last few years. I remember a discussion a few years back where CIOs wanted to stay far away from data governance. With data and analytics now central to the IT mission and the emergence of the CDO function, I expect the importance of data and analytics to grow.

Additional content

Why business winners are data driven
Creating a data-driven enterprise
The dos and don'ts of data lakes
3 capabilities that will propel big data past the 'trough of disillusionment'

Article written by Myles Suer
Image credit: Getty Images, DigitalVision, Hiroshi Watanabe
The inventor of the first neurocomputer, Dr. Robert Hecht-Nielsen, defines a neural network as "...a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs."

Artificial neural networks are one of the most important tools used in machine learning. They are brain-inspired systems designed to replicate the way that humans learn. Neural networks help us cluster and classify. Typically, a neural network is initially trained by being fed large amounts of data. Neural networks are excellent tools for finding patterns that are far too complex or numerous for a human programmer to extract and teach the machine to recognize. They draw on several principles, including gradient-based training, fuzzy logic, genetic algorithms, and Bayesian methods. They can also be described by the number of hidden nodes the model has, or in terms of how many inputs and outputs each node has.

Even more crucial are an AI's inferences – the things you don't know that it knows about you. For instance, take the case of machine learning and missing field analysis. Consider the matchmaking analysis done by a leading matrimonial portal: the system determines what it knows about a person and tries to build a mental frame and picture of that person, and by correlating the missing fields it can produce better matches. However, this also carries the risk of negative outcomes, and users may eventually quit if things go wrong.

In his article, "The tough black box choices with algorithmic transparency in India," Aditya Talwai writes: "MIT Technology Review called this reliance on alchemy over understanding AI's 'dark secret.'
Take, for example, the deep neural net – a machine learning method which produces some of the most uncannily accurate automated decisions. Labeled data is fed into a deep neural net and passed through layers and layers of computation before an output is reported. If the output doesn't match the expected label, the neural net is instructed to work backwards and tweak its parameters. Eventually, the system will tune itself to arrive at the desired output – say, identifying a dog as a dog. It is an arcane combination of functions and weights, but it is incredibly good at identifying other pictures of dogs... To be sure, not all automated decision-making systems are as hard to interpret as deep neural nets. Less sophisticated rule-based models, like decision trees, can in fact yield legible explanations for their decisions, but whether companies will part with this information willingly is another story. Leaving aside concerns of manipulation and 'gaming', the heuristics baked into a company's automated decision-making form a core part of their intellectual property. Even more crucial are an AI's inferences – the things you don't know that it knows about you."

Artificial neural network applications

Having introduced artificial neural networks, let us now take a look at their major applications.

Handwriting Recognition – Neural networks can recognize handwritten characters, an important capability for handheld devices like the Palm Pilot.
Traveling Salesman Problem – Can be solved by neural networks, though only to a certain degree of approximation.
Image Compression – Massive numbers of images can be received and processed at the same time by neural networks, allowing more sites to take advantage of image compression.
Stock Exchange Prediction – Perhaps the most important and critical application.
Many factors determine how a stock will behave, whether it goes up or down on any given day. Neural networks can utilize this information to predict the outcome of a stock on a given day.

Toward a working class AI

In an excerpt from his article, "You're Using Neural Networks Every Day Online – Here's How They Work," Jamie Condliffe reports the following:

"...If neural networks are developing so rapidly, is the sky the limit? 'You can certainly expect to see major improvements in image and speech recognition in the coming years,' says Professor Charles Cadieu, a Research Affiliate at MIT, pointing out that these modern neural networks have only really been around for a couple of years anyway. As for language processing, it's less clear that neural networks will be able to deal with the problems so well. While image and speech recognition definitely work in the layered way that modern neural networks do, there's less neuroscientific evidence to suggest that language is processed in the same way, according to Cadieu. That may mean that artificial language processing will soon run into conceptual barriers. One thing is clear, though: these kinds of artificial intelligences are already lending a huge helping hand to humans. In the past, you had to sift through your photographs to compile an album from your latest vacation or to find that pic of your buddy Bob drinking a beer. But today, neural network software can do that for you. Google Photos prepares albums automatically, and its smart search function will find images with alarming accuracy. And this kind of consumer-focused software is a mere gimmick compared to the feats that neural networks could one day perform for us. It's not hard to imagine image-processing algorithms gaining enough intelligence to vet medical images for tumors, with doctors merely checking their result. Voice recognition systems could become so advanced that telemarketing campaigns will be run by software alone.
Language processing networks will allow news stories to be written by machine. In fact, all these things are already happening to some extent. The changes are profound enough that researchers at the University of Oxford estimate that up to a half of jobs, including the one possessed by yours truly, will be lost to AI systems powered by neural networks in coming years. But shifts in economies and employment have been driven by technology many times before, from the printing press and motor car, to computers and the internet. Though social upheaval will arise, so too will benefits. Ultimately, neural networks will give everyone access to intelligence that currently lies in the hands of a few. And that will lead to smarter systems, better services, and more time to solve the human problems that computers will never be able to fix."

The growth and rapid expansion of the internet is largely responsible for the massive amounts of data being generated and distributed today. We are using neural nets and ML in facial recognition, image processing and search, real-time language translation, medicine, healthcare, and weather – to name just a few areas. Whatever the future of machine learning and AI is, it will depend to a great extent on advances in the cognitive sciences. The more personal data these algorithms are fed, the better they understand a user's profile, enabling them to spot potential anomalies earlier. Neural networks use a network of nodes (which act like neurons) and edges (which act like synapses) to process data: inputs are passed through the network to generate a series of outputs. It must be remembered that neural networks aren't the right solution for everything, but they excel at dealing with complex data.
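The node-and-edge processing just described, and the "work backwards and tweak its parameters" loop from the Talwai excerpt, can be sketched as a tiny two-layer network. This is a minimal illustrative sketch in Python with NumPy, not code from any article quoted here; the XOR dataset, layer sizes, and learning rate are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic pattern a single node cannot learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 "neurons"; the weights on the edges are the
# tunable parameters the training loop adjusts.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

lr = 1.0
for _ in range(5000):
    # Forward pass: inputs flow through nodes and edges to an output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: when the output misses the label, work backwards
    # and tweak the parameters (gradient descent on squared error).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # predictions for the four XOR inputs
```

With enough iterations the outputs move toward the labels, which is the "system will tune itself to arrive at the desired output" behavior the excerpt describes; real libraries do the same thing at vastly larger scale.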
According to Eric Ravenscraft in his article, "What Neural Networks, Artificial Intelligence, and Machine Learning Actually Do," "Google and Microsoft using neural networks to power their translation apps is legitimately exciting because translating languages is hard. We've all seen broken translations, but neural network learning could let the system learn from correct translations to get better over time. We've seen a similar thing happen with voice transcription." In fact, some companies will be able to develop powerful neural networks that really do comprehend things and make them better.

In his article "Google says machine learning is the future. So I tried it myself," Alex Hern stated that "when Google made TensorFlow open to anyone to use, it wrote: 'By sharing what we believe to be one of the best machine learning toolboxes in the world, we hope to create an open standard for exchanging research ideas and putting machine learning in products.' And it's not alone in that – every major machine learning implementation is available for free to use and modify, meaning it's possible to set up a simple machine intelligence with nothing more than a laptop and a web connection."

Neuroscientists, along with software engineers, are becoming part of multidisciplinary teams in large corporations to design products and services. Neural networks, artificial intelligence, and machine learning all describe ways for computers to do more advanced tasks and learn from their environment. While you may hear the terms used interchangeably by app developers, they can be very different in practice.

Article written by Raj Kosaraju
Image credit: Getty Images, DigitalVision Vectors, Bubaone
View All Insights