Find the Best Jobs in Technology & Data

Advance your career! Find real jobs posted by real employers on our award-winning platform.

Job Seekers

Post your resume on the best tech job board and apply to Tech Jobs, Analytics Jobs, and Data Jobs.
Add Resume

Employers

Advertise on the best tech job board to reach top talent for Tech Jobs, Analytics Jobs, and Data Jobs.
Post a Job

Latest Insights

Cloud is an innovative way to manage computing and storage resources. Reduced TCO, agility, and data localization are some of the features that make a compelling case for cloud adoption. It also gives enterprises the flexibility to deliver solutions that scale up and down with economic requirements and situations. Cloud is not only about sharing resources but also a shared responsibility model. Multitenancy is synonymous with cloud: multiple customers, both within and across organizations, share the same resource pool. Abstraction and orchestration are the two characteristics that enable cloud to deliver resources in a segregated and isolated manner. Broadly, the scope of security and compliance does not change much with cloud, but cloud does introduce complexity in the roles and responsibilities between a cloud user and provider with regard to securing different components of the solution. Service models often overlap, and the resulting project may be a combination of IaaS and PaaS. The technologies, tools, and configurations offered by a provider can differ at each stage and depend on the model finalized. These gaps should be identified as part of architecture design.

Governance and risk management

Cloud computing has a direct impact on governance and risk management due to the shared resource model. A cloud provider should not be treated as just another third-party service provider: it is not dedicated to one customer, and it may not be feasible for it to fully customize its offerings and legal agreements. Negotiated contracts, supplier assessments, and compliance reports are some of the tools for exercising governance. A very good analogy put forth by the Cloud Security Alliance is, "Think of a shipping service. When you use a common carrier/provider you don't get to define their operations. You put your sensitive documents in a package and entrust them to meet their obligations to deliver it safely, securely, and within the expected Service Level Agreement." Enterprises should be ready to accept that compliance reports may not be fully accessible, as a cloud provider serves many customers on the same platform and may have reservations about sharing the complete report. Organizations should define their risk tolerance based on the assets involved and the agreed service model.

Geographical restrictions and data protection laws

Most data protection laws and guidelines were developed in the late 1960s and 1970s and were later clarified and expanded by the OECD. Quite a few countries mandate that personal data, as defined in their regulations, must not move outside their geographical boundaries. Cloud providers should explicitly document user location, infrastructure location, data classification, and any other restrictions involved. At times, these cross-location requirements can be conflicting and difficult to manage. 'Privacy by design' should be the guiding principle when defining any product or service. Without restrictions and guidelines, data can easily be replicated into multiple pockets and become practically impossible to identify and delete.

Data security, encryption, and migrations

Data security depends upon the location of the data, its classification, its storage format, the applicable access controls, and the encryption tools and technologies used. The most common types of data storage in the cloud are object/file-based, volume, and database (relational/NoSQL). Another framework in use is data dispersion, which breaks data into small parts and stores multiple copies on different physical storage. Sending data to cloud object storage via APIs is considered relatively reliable and cost effective compared with setting up a dedicated SFTP server. The architecture should also include tools to detect large data transfers and migrations: Cloud Access Security Broker (CASB) and Data Loss Prevention (DLP) tools help detect large data migrations and monitor the network, and some are capable of security alerting as well. Securing data in motion is an important aspect of cloud computing. Options for encrypting data in transit include client-side encryption, network encryption (TLS/SFTP), and proxy-based encryption.
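As a rough illustration of the client-side option, here is a minimal sketch in Python, assuming the cryptography package; the file contents and the commented upload call are placeholders rather than any particular provider's API:

```python
# Minimal sketch: encrypt data client-side so the provider only ever stores ciphertext.
from cryptography.fernet import Fernet

def encrypt_for_upload(plaintext: bytes) -> tuple[bytes, bytes]:
    """Generate a data key and encrypt locally before any upload."""
    key = Fernet.generate_key()          # keep this key in an HSM/KMS, never beside the object
    ciphertext = Fernet(key).encrypt(plaintext)
    return key, ciphertext

key, blob = encrypt_for_upload(b"quarterly-results.csv contents")
# provider_sdk.upload(bucket="example-bucket", name="results.enc", data=blob)  # placeholder call
print(len(blob), "encrypted bytes ready for upload")
```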
The design should also be ready to accept public data, as that may be one of the expectations of the solution, and it should be capable of isolating and scanning that data before integrating it with the primary data store. Key management is tightly coupled with these choices and can be implemented with a Hardware Security Module (HSM), a cloud-provider-specific virtual appliance, or a hybrid of the two. For data at rest, the two main techniques are encryption and tokenization, and the methods may vary based on service model, provider, and deployment. A blanket encryption policy may be easy to adopt, but we should understand that processing encrypted data increases compute time.
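To make the tokenization option concrete, here is a minimal sketch, with an in-memory dictionary standing in for what would in practice be a separate, tightly controlled token vault:

```python
# Minimal sketch of tokenization: replace a sensitive value with a random token
# and keep the token-to-value mapping in a separate vault.
import secrets

vault: dict[str, str] = {}   # token -> original value; illustrative only

def tokenize(value: str) -> str:
    token = "tok_" + secrets.token_hex(8)   # random, carries no information about the value
    vault[token] = value
    return token

def detokenize(token: str) -> str:
    return vault[token]      # only privileged services should be able to call this

record = {"name": "Jane Doe", "card": tokenize("4111-1111-1111-1111")}
print(record)                        # the primary data store sees only the token
print(detokenize(record["card"]))    # the original value is recoverable via the vault
```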
We have discussed various options around data encryption, but as a guideline, cloud application architecture should be defined with a threat model as an input, documenting the key-exposure mechanism, the location of the encryption engine, and so on. Cloud provider capabilities should also be taken as an input to the application architecture: the native security options a provider offers may be not only better but also more cost effective, by not re-inventing the wheel. Moving to the cloud should be treated as an opportunity to define better ways to process and manage data.

(Reference: Cloud Security Alliance's "Security Guidance for Critical Areas of Focus in Cloud Computing v4.0")

Article written by Akshey Gupta
Image credit: Cloud Security Alliance

Half of startups have no women on their leadership team, according to Silicon Valley Bank's Women in Technology Leadership 2019 report, based on survey responses from technology and healthcare executives in major innovation hubs. While the annual report finds some progress, a lack of gender parity persists. Just 56 percent of startups have at least one woman in an executive position, and only 40 percent have at least one woman on the board of directors. The report also found that 59 percent of startups have some type of program in place designed to increase the number of women in leadership, and that the founding team's gender often determines which executive roles women hold at startups.

"We have measured gender parity in startup leadership since 2014 and the numbers continue to be concerning. We must do better," said Greg Becker, CEO of Silicon Valley Bank. "There is, however, a bright spot in that startups are recognizing the pressing need to be more proactive; 59 percent now have programs in place to help close the gender gap. While there is still a great deal of work to be done, we believe that the innovation economy is making progress and sees increasing gender diversity as an important way to attract skilled talent, one of the biggest challenges facing startups."

The report measures the percentage of women in leadership positions and compares startups' views of the innovation economy based on the gender of their founders. Survey findings include:

28 percent of startups have at least one woman on the founding team.

Founder gender often determines women's roles. Just five percent of startups with only men on the founding team have a female CEO, and those startups are much more likely to have women leading HR and marketing.

59 percent of startups have programs in place designed to support gender diversity, the highest percentage since the report's inception in 2014.

Raising capital is hard for all startups, and even more challenging for companies with at least one female founder: 87 percent of such companies describe the fundraising environment as somewhat or extremely challenging, compared with 78 percent of all-male founding teams. Startups with at least one female founder are also more likely to engage with small investors.

On gender-based hiring goals, 24 percent of startups have company-wide hiring and promotion goals, seven percent have goals for C-level positions only, and 17 percent have goals to add female board members.

Startups were also asked to describe the programs they have in place to support gender diversity; the most common include creating a flexible work environment, recruiting/interview techniques, and leadership development.

"Over the past couple of years at Gapsquare, we have seen tech companies using data to narrow gaps," said Dr. Zara Nanu, CEO and co-founder of Gapsquare, a UK-based big data company that uses cloud-based software to analyze and narrow the gender pay gap and to build equality and diversity into company practices. "There are a few initiatives that are already providing results. Some of the most successful initiatives we have seen focus on providing flexible working for all employees, and finding new and innovative ways to think about hiring new talent as well as promoting employees within the company. We have also seen progress in companies which analyze all of the reward and compensation elements and restructure them to ensure they benefit everyone."
About the Women in Technology Leadership 2019 report

This report is part of the 10th anniversary edition of SVB's Startup Outlook Report, which is based on a survey of 1,400 technology and healthcare startup founders and executives, primarily in the US, the UK, China, and, for the first time, Canada. Follow the conversation on Twitter at @SVB_Financial and with #StartupOutlook. Download and read the full report.

Article published by icrunchdata
Image credit: Silicon Valley Bank
Augmented analytics, continuous intelligence, and explainable artificial intelligence (AI) are among the top trends in data and analytics technology that have significant disruptive potential over the next three to five years, according to Gartner, Inc. Rita Sallam, research vice president at Gartner, said data and analytics leaders must examine the potential business impact of these trends and adjust business models and operations accordingly, or risk losing competitive advantage to those who do.

"The story of data and analytics keeps evolving, from supporting internal decision making to continuous intelligence, information products, and appointing chief data officers," she said. "It's critical to gain a deeper understanding of the technology trends fueling that evolving story and prioritize them based on business value."

According to Donald Feinberg, vice president and distinguished analyst at Gartner, the very challenge created by digital disruption — too much data — has also created an unprecedented opportunity. The vast amount of data, together with increasingly powerful processing capabilities enabled by the cloud, means it is now possible to train and execute algorithms at the large scale necessary to finally realize the full potential of AI.

"The size, complexity, and distributed nature of data, speed of action, and the continuous intelligence required by digital business mean that rigid and centralized architectures and tools break down," Mr. Feinberg said. "The continued survival of any business will depend upon an agile, data-centric architecture that responds to the constant rate of change."

Gartner recommends that data and analytics leaders talk with senior business leaders about their critical business priorities and explore how the following top trends can enable them.

Trend No. 1: Augmented Analytics

Augmented analytics is the next wave of disruption in the data and analytics market. It uses machine learning (ML) and AI techniques to transform how analytics content is developed, consumed, and shared. By 2020, augmented analytics will be a dominant driver of new purchases of analytics and BI platforms, of data science and ML platforms, and of embedded analytics. Data and analytics leaders should plan to adopt augmented analytics as platform capabilities mature.

Trend No. 2: Augmented Data Management

Augmented data management leverages ML capabilities and AI engines to make enterprise information management categories, including data quality, metadata management, master data management, data integration, and database management systems (DBMSs), self-configuring and self-tuning. It automates many manual tasks and allows less technically skilled users to work with data more autonomously, freeing highly skilled technical resources to focus on higher-value tasks. Augmented data management converts metadata from being used only for audit, lineage, and reporting to powering dynamic systems: metadata is changing from passive to active and is becoming the primary driver for all AI/ML. Through the end of 2022, manual data management tasks will be reduced by 45 percent through the addition of ML and automated service-level management.

Trend No. 3: Continuous Intelligence

By 2022, more than half of major new business systems will incorporate continuous intelligence that uses real-time context data to improve decisions. Continuous intelligence is a design pattern in which real-time analytics are integrated within a business operation, processing current and historical data to prescribe actions in response to events. It provides decision automation or decision support, and it leverages multiple technologies such as augmented analytics, event stream processing, optimization, business rule management, and ML.
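As a toy illustration of the pattern, and not any vendor's product, here is a minimal event-stream sketch in Python; the event fields and the threshold rule are illustrative assumptions:

```python
# Minimal sketch of continuous intelligence: score each incoming event against
# current + historical context and prescribe an action in real time.
from collections import deque
from statistics import mean

history = deque(maxlen=500)  # rolling window of recent order values (historical context)

def on_event(order_value: float) -> str:
    """Illustrative business rule: flag orders far above the recent average."""
    baseline = mean(history) if history else order_value
    history.append(order_value)
    if order_value > 3 * baseline:
        return "hold_for_review"   # decision automation
    return "auto_approve"          # default action

for value in [120.0, 95.0, 110.0, 2000.0]:   # stand-in for a real event stream
    print(value, "->", on_event(value))
```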
"Continuous intelligence represents a major change in the job of the data and analytics team," said Ms. Sallam. "It's a grand challenge — and a grand opportunity — for analytics and BI (business intelligence) teams to help businesses make smarter real-time decisions in 2019. It could be seen as the ultimate in operational BI."

Trend No. 4: Explainable AI

AI models are increasingly deployed to augment and replace human decision making. However, in some scenarios, businesses must justify how these models arrive at their decisions. To build trust with users and stakeholders, application leaders must make these models more interpretable and explainable. Unfortunately, most advanced AI models are complex black boxes that cannot explain why they reached a specific recommendation or decision. Explainable AI in data science and ML platforms, for example, auto-generates an explanation of models in terms of accuracy, attributes, model statistics, and features in natural language.

Trend No. 5: Graph

Graph analytics is a set of analytic techniques that allows for the exploration of relationships between entities of interest such as organizations, people, and transactions. The application of graph processing and graph DBMSs will grow at 100 percent annually through 2022, continuously accelerating data preparation and enabling more complex and adaptive data science. Graph data stores can efficiently model, explore, and query data with complex interrelationships across data silos, but the need for specialized skills has limited their adoption to date, according to Gartner. Graph analytics will grow in the next few years due to the need to ask complex questions across complex data, which is not always practical, or even possible, at scale using SQL queries.
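To see why such multi-hop questions are awkward in SQL but natural in a graph, here is a minimal sketch assuming the networkx package; the entity names and edges are illustrative:

```python
# Minimal sketch of graph analytics: model entities and relationships, then ask
# how two people are connected across otherwise separate records.
import networkx as nx

g = nx.Graph()
g.add_edge("Alice", "Acme Corp", relation="employee")
g.add_edge("Bob", "Acme Corp", relation="board member")
g.add_edge("Bob", "Txn-1042", relation="initiated")
g.add_edge("Carol", "Txn-1042", relation="approved")

# Multi-hop relationship query: shortest chain of entities linking Alice to Carol.
print(nx.shortest_path(g, "Alice", "Carol"))
# -> ['Alice', 'Acme Corp', 'Bob', 'Txn-1042', 'Carol']
```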
Trend No. 6: Data Fabric

Data fabric enables frictionless access to and sharing of data in a distributed data environment. It enables a single, consistent data management framework that allows seamless data access and processing by design across otherwise siloed storage. Through 2022, bespoke data fabric designs will be deployed primarily as static infrastructure, forcing organizations into a new wave of cost to completely redesign for more dynamic data mesh approaches.

Trend No. 7: NLP/Conversational Analytics

By 2020, 50 percent of analytical queries will be generated via search, natural language processing (NLP), or voice, or will be generated automatically. The need to analyze complex combinations of data and to make analytics accessible to everyone in the organization will drive broader adoption, allowing analytics tools to be as easy to use as a search interface or a conversation with a virtual assistant.

Trend No. 8: Commercial AI and ML

Gartner predicts that by 2022, 75 percent of new end-user solutions leveraging AI and ML techniques will be built with commercial solutions rather than open-source platforms. Commercial vendors have now built connectors into the open-source ecosystem, and they provide the enterprise features necessary to scale and democratize AI and ML, such as project and model management, reuse, transparency, data lineage, and platform cohesiveness and integration, which open-source technologies lack.

Trend No. 9: Blockchain

The core value proposition of blockchain and distributed ledger technologies is providing decentralized trust across a network of untrusted participants. The potential ramifications for analytics use cases are significant, especially those leveraging participant relationships and interactions. However, it will be several years before four or five major blockchain technologies become dominant. Until that happens, technology end users will be forced to integrate with the blockchain technologies and standards dictated by their dominant customers or networks, including integration with existing data and analytics infrastructure, and the costs of integration may outweigh any potential benefit. Blockchains are a data source, not a database, and will not replace existing data management technologies.

Trend No. 10: Persistent Memory Servers

New persistent-memory technologies will help reduce the cost and complexity of adopting in-memory computing (IMC)-enabled architectures. Persistent memory represents a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads. It has the potential to improve application performance, availability, boot times, clustering methods, and security practices while keeping costs under control. It will also help organizations reduce the complexity of their application and data architectures by decreasing the need for data duplication.

"The amount of data is growing quickly and the urgency of transforming data into value in real time is growing at an equally rapid pace," Feinberg said. "New server workloads are demanding not just faster CPU performance, but massive memory and faster storage."

Upcoming Data & Analytics Summits

Gartner Data & Analytics Summits 2019 will take place March 18-21 in Orlando, May 29-30 in Sao Paulo, June 10-11 in Mumbai, September 11-12 in Mexico City, and October 19-20 in Frankfurt. Follow news and updates from the events on Twitter using #GartnerDA.

Article published by icrunchdata
Image credit: Getty Images, Moment, Juhari Muhade
View All Insights