Posts

Showing posts from September, 2019

2019 Datanami Readers’ and Editors’ Choice Awards

Datanami is pleased to announce the results of its fourth annual Readers’ and Editors’ Choice Awards, which recognize the companies, products, and projects that have made a difference in the big data community this year. These awards, which are nominated and voted on by Datanami readers, give us insight into the state of the community. We’d like to thank our dedicated readers for weighing in on their top picks for the best in big data. It’s been a privilege for us to present these awards, and we extend our congratulations to this year’s winners.

Best Big Data Product or Technology: Machine Learning (Readers’ Choice: Elastic; Editors’ Choice: SAS Visual Data Mining & Machine Learning)
Best Big Data Product or Technology: Internet of Things (Readers’ Choice: SAS Analytics for IoT; Editors’ Choice: The Striim Platform)
Best Big Data Product or Technology: Big Data Security (Readers’ Choice: Cloudera Enterprise; Editors’ Choice: Elastic Stack)
Best Big Data Product o

The Forrester Wave™: Streaming Analytics, Q3 2019

Key Takeaways

Software AG, IBM, Microsoft, Google, And TIBCO Software Lead The Pack
Forrester's research uncovered a market in which Software AG, IBM, Microsoft, Google, and TIBCO Software are Leaders; Cloudera, SAS, Amazon Web Services, and Impetus are Strong Performers; and EsperTech and Alibaba are Contenders.

Analytics Prowess, Scalability, And Deployment Freedom Are Key Differentiators
Depth and breadth of analytics types on streaming data are critical. But that is all for naught if streaming analytics vendors cannot also scale to handle potentially huge volumes of streaming data. Also, it's critical that streaming analytics can be deployed where it is most needed, such as on-premises, in the cloud, and/or at the edge.

Read report >>>

Gartner - The CIO’s Guide to Blockchain 2019

More than $4 trillion in goods are shipped globally each year, and the roughly 80% of those goods carried via ocean shipping generates a great deal of paperwork: the trade documentation required to process and administer the goods costs approximately one-fifth of the actual physical transportation costs. Last year, a logistics business and a large technology company developed a joint global trade digitalization platform built using blockchain technology. It will enable them to establish a shared, immutable record of all transactions and give all of the disparate partners access to that information at any time. Although the distributed, immutable, encrypted nature of blockchain solutions can help with such business issues, blockchain can achieve much more than that. Large companies looking to explore new disruptive business opportunities need to think beyond efficiency gains, and to do so, they need real blockchain solutions. Full article >>>
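To make the “shared, immutable record” idea concrete, here is a minimal, illustrative sketch in Python of a tamper-evident, append-only ledger. It is not the actual platform’s design, and the sample transaction data is hypothetical; each block stores the hash of its predecessor, so altering any past entry invalidates every hash that follows.

```python
import hashlib
import json
from dataclasses import dataclass
from typing import List

@dataclass
class Block:
    index: int
    transactions: List[dict]  # e.g. shipping documents, customs events (illustrative)
    prev_hash: str            # hash of the previous block in the chain
    hash: str = ""

    def compute_hash(self) -> str:
        payload = json.dumps(
            {"index": self.index, "transactions": self.transactions, "prev_hash": self.prev_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

class Ledger:
    def __init__(self):
        genesis = Block(index=0, transactions=[], prev_hash="0" * 64)
        genesis.hash = genesis.compute_hash()
        self.chain = [genesis]

    def append(self, transactions: List[dict]) -> Block:
        block = Block(index=len(self.chain), transactions=transactions,
                      prev_hash=self.chain[-1].hash)
        block.hash = block.compute_hash()
        self.chain.append(block)
        return block

    def is_valid(self) -> bool:
        # Any edit to an earlier block breaks the hash links that follow it.
        return all(
            curr.prev_hash == prev.hash and curr.hash == curr.compute_hash()
            for prev, curr in zip(self.chain, self.chain[1:])
        )

ledger = Ledger()
ledger.append([{"doc": "bill_of_lading", "shipment": "MSKU1234567"}])  # hypothetical record
print(ledger.is_valid())                      # True
ledger.chain[1].transactions[0]["doc"] = "x"  # tamper with recorded history
print(ledger.is_valid())                      # False: the record is tamper-evident
```

A real blockchain platform adds distribution and consensus across partners on top of this basic hash-chaining idea, which is what lets all parties trust the same record.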

Neo4j Backs Launch of GQL Project: First New ISO Database Language Since SQL

Neo4j, the leader in graph databases, announced today that the international committees that develop the SQL standard have voted to initiate GQL (Graph Query Language) as a new database query language. Now to be codified as the international standard declarative query language for property graphs, GQL represents the culmination of years of effort by Neo4j and the broader database community.

English: GQL to incorporate and consider several graph database languages.
Cypher: (:Neo4j)-[:BACKS]->(GQL:Project)<-[:STARTED]-(:ISO)-[:STANDARDIZED]->(SQL:Project)

The initiative for GQL was first advanced in the GQL Manifesto in May 2018. A year later, the project was considered at an international gathering in June. Ten countries, including the United States, Germany, the UK, Korea, and China, have now voted in favor, with seven countries promising active participation by national experts. It has been well over 30 years since ISO/IEC began the SQL project. SQL went on to become the
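For readers unfamiliar with property-graph querying, here is a small sketch of running a Cypher query (one of the languages feeding into GQL) from Python with the official neo4j driver. The connection URI, credentials, and the graph schema it queries are placeholders, not part of the announcement.

```python
from neo4j import GraphDatabase  # official Neo4j Python driver

# Placeholder connection details for illustration only.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# A declarative Cypher query over a property graph, using the same
# pattern syntax as the announcement above (hypothetical data model).
query = """
MATCH (org:Organization)-[:STANDARDIZED]->(lang:Language)
RETURN org.name AS organization, lang.name AS language
"""

with driver.session() as session:
    for record in session.run(query):
        print(record["organization"], "->", record["language"])

driver.close()
```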

Distributed SQL System Review: Snowflake vs Splice Machine

After many years of Big Data, NoSQL, and Schema-on-Read detours, there is a clear return to SQL as the lingua franca for data operations. Developers need the comprehensive expressiveness that SQL provides. A world without SQL ignores more than 40 years of database research and results in hard-coded spaghetti code in applications to handle functionality that SQL handles extremely efficiently, such as joins, groupings, aggregations, and (most importantly) rollback when updates go wrong.

Luckily, there is a modern architecture for SQL, called Distributed SQL, that no longer suffers from the challenges of traditional SQL systems (cost, scalability, performance, elasticity, and schema flexibility). The key attribute of Distributed SQL is that data is stored across many distributed storage locations and computation takes place across a cluster of networked servers. This yields unprecedented performance and scalability because it distributes work on each worker node in the cluster in parall
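As a reminder of what that expressiveness buys, here is a minimal sketch of the join, grouping, aggregation, and rollback behaviors the post refers to. It uses single-node SQLite purely for illustration, not a distributed engine, and the schema and data are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 1, 100.0), (2, 1, 250.0), (3, 2, 75.0)])
conn.commit()

# Join + grouping + aggregation in one declarative statement.
for name, total in conn.execute("""
    SELECT c.name, SUM(o.amount)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
"""):
    print(name, total)

# Rollback when an update goes wrong: the failed transaction leaves no trace.
try:
    with conn:  # the connection as a context manager wraps a transaction
        conn.execute("UPDATE orders SET amount = amount * 1.1")
        raise RuntimeError("downstream validation failed")
except RuntimeError:
    pass

print(conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0])  # still 425.0
```

A Distributed SQL engine offers the same declarative surface, but plans and executes these statements across a cluster rather than a single process.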

Dremio 4.0 Data Lake Engine

Dremio’s Data Lake Engine delivers lightning-fast query speed and a self-service semantic layer operating directly against your data lake storage. No moving data to proprietary data warehouses or creating cubes, aggregation tables, and BI extracts. Just flexibility and control for Data Architects, and self-service for Data Consumers. This release, also known as Dremio 4.0, dramatically accelerates query performance on S3 and ADLS, and provides deeper integration with the security services of AWS and Azure. In addition, this release simplifies the ability to query data across a broader range of data sources, including multiple lakes (with different Hive versions) and through community-developed connectors offered in Dremio Hub. Read full article >>>
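Dremio’s SQL interface is typically reached through its standard ODBC/JDBC drivers; below is a hedged sketch of querying it from Python via pyodbc. The DSN name, credentials, and the "lake"."web"."events" table path are placeholders, and the exact connection string depends on how the Dremio driver is configured in your environment.

```python
import pyodbc

# Placeholder DSN configured for the Dremio ODBC driver; adjust host,
# port, and credentials to match your environment.
conn = pyodbc.connect("DSN=Dremio;UID=analyst;PWD=secret", autocommit=True)

# Hypothetical query against a dataset exposed from data lake storage
# (e.g. Parquet files on S3 registered in Dremio as "lake"."web"."events").
cursor = conn.cursor()
cursor.execute("""
    SELECT event_type, COUNT(*) AS events
    FROM "lake"."web"."events"
    WHERE event_date >= '2019-09-01'
    GROUP BY event_type
    ORDER BY events DESC
""")

for event_type, events in cursor.fetchall():
    print(event_type, events)

conn.close()
```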