Palo Alto, CA, United States

Hortonworks is an enterprise software company based in Palo Alto, California. The company focuses on the development and support of Apache Hadoop, a framework for the distributed processing of large data sets across clusters of computers.


News Article | May 23, 2017
Site: techcrunch.com

TechCrunch is pleased to bring you Alchemist Accelerator's demo day. Alchemist is one of those rare programs that focuses on enterprise startups. These aren't your parents' enterprise companies. Pitches today will span products that help businesses with crowdfunding, wearables, sustainable farming and managing meetings with the power of AI. Investors and press will hear pitches from 15 enterprise companies. The demos start at 3:00pm PT and are expected to last two hours. You can watch it live here.

Nobal Technologies: The Imirror is the world's most advanced interactive mirror, helping retail connect with consumers in the fitting room, where buying decisions are made. Team: Pieter Boekhoff (2016 Startup Canada Entrepreneur of the Year, CIS MRU), Thomas Battle (MBA), Alain Kassangana (Masters Eng).

Everykey: A revolution in access control, Everykey can unlock your phone, laptop, car, house or any other device when you're close by, and can also log you into your website accounts. Team: John McAfee (founder of McAfee Antivirus), Simon Boag (former president of Chrysler and GM), Chris Wentz (who made more than $100,000 in college selling iPads), Max Simon (started a haunted house in middle school and grew it to millions in annual revenue).

Flowzo: The fastest internet service powered by locals, helping property owners generate cash flow. Team: Thu Nguyen (5G/wireless at TELUS/Marvell, Waterloo Engineering), Andrew Ta (managed technology strategy teams at both of the largest Canadian telecoms, TELUS and Bell; Ivey MBA), Chris Yap (shipped fiber chips at Marvell, previous startup focused on 60GHz hardware, Stanford Engineering).

Visage: Smart crowdsourcing platform allowing companies to find and engage instantly with diverse talent. Team: Joss Leufrancois (co-founded Aldelia, a $50 million recruitment business established in nine countries), Emmanuel Marboeuf (former technical expert and international speaker in cyberdefense innovation).

Amper: Helps plant managers in factories increase machine and labor productivity by giving them real-time access to key machine metrics. Data is captured using a simple, non-invasive retrofit sensor that monitors across machine types. Team: Akshat Thirani (CS, Northwestern University; won the Thiel Summit pitch prize), Phil House (CS, Northwestern University), Sachin Lal (CS, Northwestern University; won the 2016 MIT Clean Energy prize).

Uptime (previously EyePiece): Building a smarter industrial communication system using wearables. Team: Will Schumaker (PhD in optics, Stanford/UMich), Michael Leung (PhD student in EE, Stanford/Waterloo).

ContextSmith: Enterprise sales and expansion platform that uses AI to maximize revenue for sales and account management teams. Team: Will Cheung (head of customer success at a Sequoia-backed startup, CS from CMU), Jochen Bedersdorfer (Intel AI Group, VP Eng of a text analytics startup).

Edyza: High-density Internet of Things connectivity for industrial IoT, smart agriculture and other use cases that involve scaling sensors and actuators to thousands in close proximity. Team: Rana Basheer (PhD, principal scientist at Broadcom, Garmin), Atul Patel (OneScreen, Jornaya and OptimalSocial).

HydroVirga: High-sensitivity NMR detection for elemental analysis in water. Our unique IP allows for detection of trace contaminants in real time, using micro NMR technology. Team: George Farquar, PhD (former LLNL), Julie Bowen, PhD (former LLNL), Mark Stephenson (20+ years of water industry experience).

Privacera: Data security platform for enterprises to manage and mitigate risks with sensitive data in one place. Team: Balaji Ganesan (former XA Secure, acquired by Hortonworks), Don Bosco Durai (former XA Secure, acquired by Hortonworks; former Bharosa, acquired by Oracle).

Stellic (previously Metis Labs): Student success platform designed to help students organize their college journey and graduate on time. Built by students, for students, it tackles the growing student-retention problem faced by higher education. Team: Sabih (CMU CS '15), Rukhsar (CMU CS '15), Jiyda (CMU IS '15).

MeetingSift: AI-driven meeting collaboration platform that delivers deep, actionable insights from meetings across the enterprise. Team: Alex Bergo, PhD (expert collaboration, NLP, ML), Viil Lid, PhD (collaboration tech, HCI, analytics).

Fuse: Next-generation inventory software. In the U.S., $1 trillion, or 20 percent of retail sales, is lost every year because of stockouts and overstocks. We prevent this loss by algorithmically generating a demand forecast and centralizing it with real-time sales, inventory and procurement data. Team: Anna Tolmach (Wharton, Stanford MBA), Rachel Liaw (Stanford, UCLA, JD candidate, 2018), Bridget Vuong (Stanford University, MS CS).

Moesif: Platform that makes sense of the world's API data to change how APIs are created, debugged and used. Team: Derric Gilling (Intel Xeon Phi CPU architect, Computer Eng @University of Michigan), Xing Wang (executive producer @Zynga, previously Microsoft; Computer Science @MIT).

Text IQ: Protects enterprises from high-stakes legal disasters by using AI to identify sensitive, reputationally damaging and privileged documents that frequently get missed by human analysts and attorneys. Team: Apoorv Agarwal (PhD in Computer Science, Columbia; original contributor to IBM Watson), Omar Haroun (JD/MBA, Columbia; 3x founder with a recent exit).

Deep Relevance: Internal fraud monitoring platform that uses behavioral AI to help finance and audit teams prevent employee and vendor fraud. Team: Kiran Ratnapu (VP Risk Technology, Merrill Lynch; MIT, IIT Bombay), Prasanna Kumar (healthcare and machine learning at Epic and startups, IIT Bombay CS).

Strypes: 3D visualizations powering new digital experiences for consumers across all devices and browsers. Team: Alexa Fleischman (EMC, Box, Boston College), Zack Fleischman (Microsoft Xbox, Zynga, Carnegie Mellon University), Matt Schiller (The Advisory Board, Cornell University).

The History Project: Empowers organizations to document, share and present institutional knowledge through a visual collaboration tool. Team: Niles Lichtenstein (Harvard, Monitor, Ansible, acquired by IPG; Velti, acquired by Blackstone), Michael Devin (principal tech architect at Frog, DARPA Director's Medal, GE), Ben Yee (ITP, UX at Gilt Group).


SAN JOSE, Calif., May 25, 2017 (GLOBE NEWSWIRE) -- Robin Systems, a company that is transforming the enterprise with its innovative Robin Cloud Platform (RCP), today announced that it has achieved Hortonworks' Product Integration Certification for Robin RCP. Robin Systems will integrate the Hortonworks Data Platform (HDP®) with RCP to virtualize applications and simplify application lifecycle management while maintaining a 100 percent open and agile approach to software development. Robin's customers can now benefit from enhanced integration with HDP, including Apache™ Hadoop®, to increase the elasticity of their system resources and enable an application-aware private cloud and a path to hybrid cloud.

With Product Integration Certification, customers can simplify and accelerate the deployment of Hortonworks Connected Data Solutions with validated integrations between leading enterprise technologies from partners and HDP, the industry's only 100 percent open source platform built on Apache Hadoop. To certify partner integrations, Hortonworks reviews each product for architectural best practices, validates it against a comprehensive suite of integration test cases and benchmarks it for scale under varied workloads, while documenting every step in the process.

Based on container technology, Robin brings the benefits of application virtualization to distributed, clustered and stateful enterprise applications such as databases and big data clusters, enabling high-performance workload consolidation with the agility and flexibility previously available only to micro applications. Robin RCP transforms commodity hardware into a compute, storage and data continuum. With the integration of HDP, users can scale Hadoop applications up or out with one click, simplify deployment, and manage and consolidate big data workloads across Hadoop clusters with QoS and bare-metal performance.

"In today's enterprise, where mobility is fueling an ongoing cycle of innovation, ensuring performance predictability without having to add IT resources is becoming absolutely critical," said Premal Buch, CEO of Robin Systems. "Providing this integrated solution will allow Robin and Hortonworks to streamline Big Data-as-a-Service application development and deployment. Implementing the RCP solution will enable businesses that are running Database-as-a-Service environments with large volumes of traffic to more efficiently develop, virtualize and manage Hadoop applications."

HDP was built by the core architects, builders and operators of Apache Hadoop and includes all of the components necessary to manage a cluster at scale and uncover business insights from existing and new big data sources. HDP enables multiple workloads, applications and processing engines across single clusters with optimal efficiency. It also provides an open platform that deeply integrates with existing IT investments and upon which enterprises can build and deploy Hadoop-based applications.

"Hortonworks is dedicated to expanding and empowering the Apache Hadoop ecosystem, accelerating innovation and adoption of Open Enterprise Hadoop," said Chris Sullivan, vice president of global channels and alliances at Hortonworks. "We are pleased to help make Robin Cloud Platform, the latest Robin solution to achieve HDP Integration Certification, available to our customers as part of this commitment."

Joint Webinar with Hortonworks to Provide More Details

IT and database administrators are often met with challenges when it comes to right-sizing deployments and meeting seasonal peaks without disrupting availability in a big data and Hadoop environment. In a joint webinar on May 25 at 11:00 AM PDT (2:00 PM EDT), executives from Robin Systems and Hortonworks will discuss these operational complexities. Attendees will learn how to overcome such obstacles without adversely impacting application performance, business continuity and overall productivity. The webinar will feature presenters Eric Thorsen, VP of Industry Solutions at Hortonworks, and Deba Chatterjee, Director of Products at Robin Systems. Register at: http://containers.robinsystems.com/webinar/managing_seasonal_data_peaks

Robin is transforming the way enterprise applications drive infrastructure by bringing together purpose-built, container-aware block storage and an application-aware fabric in the cloud, private and/or public, delivering unique benefits to distributed, clustered and stateful applications such as big data systems and databases. With a team that includes industry veterans from leading enterprise technology companies such as NetApp, Oracle and Veritas, Robin seeks to disrupt the $20 billion-plus virtualization market with container-based compute and storage platform software that delivers better performance, higher consolidation and much simpler application lifecycle management than traditional hypervisor-based virtualization. Founded in 2013, the San Jose, California-based company has raised more than $27 million in venture funding from leading investors such as Clear Ventures, DN Capital, USAA and SAP's Hasso Plattner Ventures.

Robin Systems, the Robin Systems logo, Robin Cloud Platform for Enterprise Applications and Application-to-Spindle Quality of Service Guarantee are trademarks or registered trademarks of Robin Systems, Inc., and are protected by trademark laws of the United States and other jurisdictions. All other product and company names are trademarks or registered trademarks of their respective companies. Hortonworks, HDP and HDF are registered trademarks or trademarks of Hortonworks, Inc. and its subsidiaries in the United States and other jurisdictions.


    SAN JOSE, Calif., June 13, 2017 /PRNewswire/ -- (DataWorks Summit/Hadoop Summit) -- IBM (NYSE: IBM) and Hortonworks (NASDAQ: HDP) today announced an expansion to their relationship focused on extending data science and machine learning to more developers and across th...


News Article | June 13, 2017
Site: www.prnewswire.com

A panel of industry experts selected the following as the 2017 winners:

TMW Systems uses HDP to develop business intelligence and big data tools that offer valuable business insights to transportation application users.

DHISCO manages over 14 billion shopping requests per month for over 100,000 hotels and 400 brands. With HDP, the company has developed an innovative and reliable distribution technology to track bookings, rates, availability and hotel content, which has driven new business insights and increased revenue.

Walgreens has embarked on several big data initiatives, including a data-warehouse migration with HDP that allows greater scalability for a fraction of the cost. This has increased Walgreens' data footprint and expanded into new use cases touching nearly every aspect of the business.

Yale New Haven Health created a continuous patient monitoring solution that records patient monitoring data, streamed in real time, from intensive care units and emergency departments. The data is processed and used for clinical and translational research projects, such as sepsis prediction in both adult and neonatal intensive care unit patients and mortality and outcome prediction, and for operational programs that work to reduce alarm fatigue and noise from unnecessary alarms.

"IBM is pleased to participate in the Hortonworks Data Heroes program judges panel and, for the first time, award a Hortonworks Data Hero with Cognitive Honors from IBM," said Paul Zikopoulos, VP of Cognitive Big Data Systems at IBM. "IBM and Hortonworks are committed to helping clients adopt machine and deep learning along their analytics journey. We're thrilled to recognize the visionary and transformational leadership of a Data Hero who is delivering new business value by bringing cognitive analytics online for their organization."

Hortonworks thanks the panel of industry experts who judged the entries and selected the winners.

Hortonworks is an industry-leading innovator that creates, distributes and supports enterprise-ready open data platforms and modern data applications that deliver actionable intelligence from all data: data-in-motion and data-at-rest. Hortonworks is focused on driving innovation in open source communities such as Apache Hadoop, Apache NiFi and Apache Spark. Along with its 2,100+ partners, Hortonworks provides the expertise, training and services that allow customers to unlock transformational value for their organizations across any line of business. Hortonworks, HDP and HDF are registered trademarks or trademarks of Hortonworks, Inc. and its subsidiaries in the United States and other jurisdictions. For more information, please visit www.hortonworks.com. All other trademarks are the property of their respective owners.

To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/hortonworks-congratulates-2017-americas-data-heroes-award-winners-300472877.html


News Article | May 8, 2017
Site: www.techrepublic.com

Big data promises much in terms of business value, but it can be difficult for businesses to determine how to go about deploying the architecture and tools needed to take advantage of it. Everything from descriptive statistics to predictive modeling to artificial intelligence is powered by big data, and what an organization wants to accomplish with big data will determine the tools it needs to roll out.

SEE: Open source big data and DevOps tools: A fast path to analytics applications (Tech Pro Research)

At the 2017 Dell EMC World conference on Monday, Cory Minton, a principal systems engineer for data analytics at Dell EMC, gave a presentation explaining the biggest decisions an organization must make when deploying big data. Here are six questions that every business must ask before getting started in the space.

1. Buy or build? The first question to ask is whether your organization wants to buy a big data system or build one from scratch. Popular products from Teradata, SAS, SAP and Splunk can be bought and simply implemented, while Hortonworks, Cloudera, Databricks and Apache Flink can be used to build out a big data system. Buying offers a shorter time to value, Minton said, as well as simplicity and good value for commodity use cases. However, that simplicity usually comes with a higher price, and these tools usually work best with low-diversity data. If your organization has an existing relationship with a vendor, it can be easier to phase in new products and try out big data tools. Many of the popular tools for building a big data system are cheap or free to use, and they make it easier to capitalize on a unique value stream. The building path provides opportunities for massive scale and variety, but these tools can be very complex, and interoperability is often one of the biggest issues faced by admins who go this route.

2. Batch or streaming data? Batch processing, offered by products like Oracle, Hadoop MapReduce and Apache Spark, is descriptive and can handle large volumes of data, Minton said. Batch jobs can also be scheduled, and they are often used to build out a playground of sorts for data scientists to experiment. Products like Apache Kafka, Splunk and Flink provide streaming data capabilities that can be captured to create potentially predictive models. With streaming data, speed trumps data fidelity, Minton said, but it also offers massive scale and variety, and it is more useful for organizations that subscribe to DevOps culture.

3. Lambda or kappa architecture? Twitter is one example of lambda architecture. Data is split into two paths, one of which is fed to a speed layer for quick insights, while the other leads to batch and service layers. Minton said that this model gives an organization access to both batch and streaming insights, and balances lossy streams well. The challenge, he said, is that you have to manage two code and app bases. Kappa architecture treats everything as a stream, but one that aims to maintain data fidelity and process in real time. All data is written to an immutable log that changes are checked against. It is hardware-efficient, requires less code, and it is the model Minton recommends for an organization that is starting fresh with big data.
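To make the kappa idea concrete, here is a minimal, self-contained sketch: every event is appended to a single immutable log, and any derived view is rebuilt by replaying that log. This is a hypothetical illustration in plain Python, not Twitter's pipeline or any vendor's product; the names ImmutableLog and build_view are invented for the example.

```python
# Sketch of the kappa pattern described above: one append-only log,
# with every derived view (re)computed by replaying it from the start.
import time
from collections import defaultdict

class ImmutableLog:
    """Append-only event log; consumers replay it from any offset."""
    def __init__(self):
        self._events = []

    def append(self, event):
        # Events are only ever appended, never modified or deleted.
        self._events.append((time.time(), event))

    def replay(self, from_offset=0):
        # Yields (offset, timestamp, event); the log itself never changes.
        for offset in range(from_offset, len(self._events)):
            ts, event = self._events[offset]
            yield offset, ts, event

def build_view(log):
    """Derive a view (event counts per key) by replaying the log."""
    counts = defaultdict(int)
    for _offset, _ts, event in log.replay():
        counts[event["key"]] += 1
    return dict(counts)

log = ImmutableLog()
for key in ["clicks", "clicks", "views"]:
    log.append({"key": key})
print(build_view(log))  # {'clicks': 2, 'views': 1}
```

The point of the sketch is the single code path: correcting or changing a view just means replaying the same log with new logic, whereas lambda's separate speed and batch layers mean maintaining two code bases.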
4. Public or private cloud? Public and private cloud for big data require many of the same considerations. For starters, an organization must consider what environment its talent is most comfortable working in; data provenance, security and compliance needs, and elastic consumption models should also be thought through.

5. Virtual or physical infrastructure? Years ago, the debate around virtual vs. physical infrastructure was much more heated, Minton said. However, virtualization has grown competitive enough with physical hardware that the two have become similar in regards to big data deployments. It boils down to what your administrators are more comfortable with and what works for your existing infrastructure.

6. DAS or NAS storage? Direct-attached storage (DAS) used to be the only way to deploy a Hadoop cluster, Minton said. However, now that IP networks have increased their bandwidth, the network-attached storage (NAS) option is more feasible for big data. With DAS, it is easy to get started, and the model works well with software-defined concepts. It is driven to handle linear growth in performance and storage, and it does well with streaming data. NAS handles multi-protocol needs well, provides efficiency at scale, and can address security and compliance needs as well.


News Article | May 10, 2017
Site: www.prnewswire.com

Hortonworks is an industry-leading innovator that creates, distributes and supports enterprise-ready open data platforms and modern data applications that deliver actionable intelligence from all data: data-in-motion and data-at-rest. Hortonworks is focused on driving innovation in open source communities such as Apache Hadoop, Apache NiFi and Apache Spark. Along with its 2,100+ partners, Hortonworks provides the expertise, training and services that allow customers to unlock transformational value for their organizations across any line of business. Hortonworks, Powering the Future of Data, HDP and HDF are registered trademarks or trademarks of Hortonworks, Inc. and its subsidiaries in the United States and other jurisdictions. For more information, please visit www.hortonworks.com. All other trademarks are the property of their respective owners. To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/hortonworks-to-participate-in-upcoming-investor-conferences-300454966.html


News Article | May 11, 2017
Site: www.techrepublic.com

Intel has backed some notable companies over the years, investing in Red Hat and VMware, two firms that helped effect major shifts in the IT industry. The chipmaker is hoping Cloudera will generate similar momentum in the field of big-data analytics, and in doing so open new avenues for growth in a stagnant enterprise IT market. To this end, Intel has invested $740m in Cloudera, giving it an 18 percent stake in the company.

Cloudera builds and supports tools to run on top of Apache Hadoop, the open-source software framework that allows data to be processed by clusters of commodity hardware for data warehousing and big-data analytics. Cloudera's distribution of Hadoop (CDH) and its subscription offering, Cloudera Enterprise, include various integrated tools to help businesses store and analyse data in Hadoop clusters, offering improved security and availability. Cloudera provides software to support real-time SQL and search-engine queries, machine learning, security, and stream and batch data processing, as well as to manage Hadoop clusters.

The firm is one of several competing to offer the Hadoop distribution of choice for businesses. Each of the companies behind the major Hadoop distributions (Hortonworks, IBM, MapR and Pivotal) provides different tools to manage, secure and exploit data stored on Hadoop clusters. But usage figures indicate that Cloudera's distribution is the most popular. Intel had released its own distribution of Hadoop, but this will now be withdrawn; Intel engineers will instead work on Cloudera's distro, which will be enhanced with features from Intel's platform.

While analysts estimate that Cloudera's paying user base may be tiny at present, about 350-strong and growing at about 50 new customers per quarter, Intel said it is buying into future potential.

"It's not really a technology play but it really is about overall business value. If you look at Intel's datacentre business over the past few years, the cloud service provider segment, the telecommunications and even the high-performance computing segments have all grown quite handsomely. But the enterprise segment has been a little bit stagnant," said Boyd Davis, general manager of Intel's datacentre software division. "What you see with big data is a different phenomenon occurring. It's injecting more investment into the IT world because there's such huge business value that gets derived from it, and that's the way I expect to see dramatic growth in our business."

But why did Intel decide against exploiting that growth with its own Hadoop distribution and instead choose to back Cloudera? Davis said Intel wanted to boost Cloudera's already strong standing in the Hadoop market and reassure companies unsure which distribution to deploy that Cloudera will be a good long-term investment. "The Hadoop ecosystem is still relatively nascent, when you compare it with the $100bn data-management market, and it's really important for us to take the risk out for customers," he said. "Enterprises like to know this is the right path, so they don't have to sit on the sidelines and wait to see how the market plays out. That was important for us as well because we want to see this market grow."

Intel is now Cloudera's largest strategic investor, defined by Cloudera as an investor where there is "alignment between corporate initiatives". The $740m investment by Intel was preceded by a cash injection of $160m into Cloudera by a variety of firms, including Google's investment arm. About 60 percent of the combined $900m investment will end up in Cloudera's pockets, according to Cloudera CEO Tom Reilly, as some of the money will go to existing investors in Cloudera. "We've raised more than half a billion dollars that goes into Cloudera," Reilly said.

Initially, Cloudera will use the funding to help organisations move from Intel's Hadoop distribution to its own. "We're hiring up engineers on our side to interface and integrate with Intel's engineering team, so we have the staff to support the partnership on the technical side of things," Reilly said. "We're going to be transitioning all Intel customers to our new distribution, which combines the best of our distributions."

Reilly sees the partnership with Intel as a springboard to accelerate the company's ambitions for global expansion. "Intel has a tremendous presence in China and India. The next thing we're going to do is to staff up and build up resources in those geographies to support the customers and continue to grow those big markets."

Both firms plan to increase their contributions to open-source projects related to Hadoop, with Reilly expressing interest in projects focused on in-memory processing, such as Apache Spark, and security. Finally, the company will also use the money to help it acquire companies, "to accelerate our growth", according to Reilly. The company still plans to go public, but Reilly said it is not "setting an expectation" as to when an IPO might occur.

Unsurprisingly, Intel's investment will result in engineers from both companies focusing on optimising Cloudera's toolset, as well as the core open-source Hadoop platform, to run on Intel's 64-bit x86 chip architecture. "Hadoop will continue to work on all platforms, but the optimisations will occur on Intel sooner and faster," Reilly said. "Intel has 94 percent market share in the datacentre. We believe the Intel platform is going to outperform other platforms."

The stance is something of a departure from a public statement made by a co-founder of Cloudera last year, when the company's CTO praised low-power ARM chips for being more efficient than competing silicon from other companies. In a discussion about ARM-based processors at the time, Cloudera co-founder and CTO Amr Awadallah was reported as saying: "Cores from other vendors, without saying their name, consume significantly more power in the idle state, hence we're relieved that ARM is moving into this space."

Intel and Cloudera have a "multi-year roadmap" of features in Intel hardware that will be exploited by Cloudera's distribution of Hadoop, and Intel's Davis said the first fruits of this collaboration are likely to be revealed in the near future. "A really good example of one of the areas where we are collaborating that will show up in Cloudera products very soon is around hardware-accelerated security," he said. "In our own distribution we took advantage of instructions in the Xeon chip that accelerate encryption, so that customers could encrypt the data in a Hadoop environment without necessarily having the performance overhead of many of the solutions out there. We had that intimate knowledge of the instructions that could accelerate the security algorithms. We built that into our distribution and are actively working to get that into Cloudera's product as quickly as we can."

When Intel launched its own Hadoop distribution last year, it promised that extensions to the instruction sets in its chips would boost performance in various ways: improving data encryption speed via AES-NI and compression using AVX and SSE 4.2. Various optimisations from Intel's Hadoop distribution will begin to be incorporated into CDH following the release of version 3.1 of the Intel distro, the final outing for the platform. Reilly said the firms' engineering collaboration and the absorption of Intel's distribution into Cloudera's platform will yield enhancements to Cloudera's offering "not just five years from now but in the coming months".

Davis expects the bulk of the collaboration between the companies will be on improvements to the open-source core Hadoop platform, but added that they will also work to improve Cloudera's proprietary tools on top of Hadoop. "It's one of our fundamental objectives to maintain an open ecosystem, and Intel's going to continue to do engineering work and contribute to the open-source community," Davis said. "We'll also continue to innovate in some of the areas around Hadoop that are not open source, on things like the management and data governance that are around Hadoop but not in the core platform. A lot of people have unique technologies there, and we will work with Cloudera on those."

On rare occasions there may also be other considerations that prevent their combined engineering teams from open-sourcing technologies, he said. "There are certain cases where open source has some downsides. Security is an example. I don't have a specific example, but sometimes you want to do something to take advantage of security capabilities in the chip that, if you were to make it open source, would actually open up security holes. But the vast majority of the innovations that we drive are going to end up in open source."

Reilly said there was a natural crossover between the capabilities of the Hadoop platform to handle large volumes of data and Intel's investment in the internet of things, which is expected to fuel an explosion in data collection and analytics.
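As a rough illustration of the hardware detection such chip-level optimisations depend on, the sketch below shows one way software can check whether the instruction-set extensions named above (AES-NI, AVX, SSE 4.2) are present before choosing an accelerated code path. It is a hedged example under stated assumptions: it is Linux-specific (it parses /proc/cpuinfo) and is not Intel's or Cloudera's actual detection code.

```python
# Illustrative only: detect CPU feature flags on Linux by reading
# /proc/cpuinfo; "aes", "avx" and "sse4_2" are the kernel's flag names
# for AES-NI, AVX and SSE 4.2 respectively.

def cpu_flags(path="/proc/cpuinfo"):
    """Return the set of CPU feature flags advertised by the kernel."""
    try:
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
    except OSError:
        pass  # Not Linux, or /proc is unavailable.
    return set()

flags = cpu_flags()
for name, flag in [("AES-NI", "aes"), ("AVX", "avx"), ("SSE 4.2", "sse4_2")]:
    status = "available" if flag in flags else "not available"
    print(f"{name}: {status}")
```

In practice a library would use such a probe to fall back to a plain software implementation when the accelerated instructions are absent, which is why the same binary can run everywhere while "the optimisations occur on Intel sooner and faster."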
