Databricks is rapidly becoming one of the most common platforms for machine learning, especially when the desired algorithm is computationally intensive, as in deep learning. The demos in this session walk through what is required to build a deep learning solution in Databricks using Jupyter notebooks, Python, and TensorFlow. You will see how the resulting solution can be scheduled to run at a given time, scaled to handle big data, and maintained in a real-world environment where multiple developers work on the project with integrated source control.
You will learn:
- How Databricks' collaboration and workspace-management features improve the process of creating, managing, and deploying a deep learning solution compared to other tools.
- The steps needed to create a solution, and how the process can follow the same release practices used in more traditional development environments.
- Through examples, when deep learning should be considered and which scenarios are best suited for TensorFlow. The experiment results are carefully reviewed to confirm the solution generalizes well and to rule out the overfitting that is common in deep learning models.
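As a minimal sketch of the kind of review described above, the gap between training and validation loss can be checked programmatically. The function names and the threshold here are hypothetical illustrations, not part of any library; with Keras, the two lists would typically come from the `History` object returned by `model.fit` (its `history['loss']` and `history['val_loss']` entries).

```python
def overfitting_gap(train_losses, val_losses):
    """Return the gap between final validation and training loss.

    A large positive gap (validation loss well above training loss)
    is a classic sign of overfitting.
    """
    return val_losses[-1] - train_losses[-1]


def looks_overfit(train_losses, val_losses, tolerance=0.1):
    """Flag a run as overfit when validation loss diverges from
    training loss by more than `tolerance`, or when validation loss
    is rising while training loss is still falling.

    `tolerance` is an illustrative default; a real review would pick
    a threshold appropriate to the loss scale of the problem.
    """
    gap = overfitting_gap(train_losses, val_losses)
    val_rising = len(val_losses) >= 2 and val_losses[-1] > val_losses[-2]
    train_falling = len(train_losses) >= 2 and train_losses[-1] < train_losses[-2]
    return gap > tolerance or (val_rising and train_falling)


# A healthy run: both losses fall together.
print(looks_overfit([0.9, 0.5, 0.3], [0.95, 0.55, 0.35]))  # False
# An overfit run: training loss keeps falling while validation loss climbs.
print(looks_overfit([0.9, 0.4, 0.1], [0.9, 0.6, 0.8]))     # True
```

This kind of check is easy to automate as the final cell of a scheduled notebook run, so a job that drifts into overfitting can fail loudly instead of silently producing a poor model.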