The trinity of errors in applying confidence intervals: An exploration using Statsmodels
Three reasons why confidence intervals should not be used in financial data analyses.
An exploration of three types of errors inherent in all financial models.