Use large datasets to train machine learning models that can be exported to TensorFlow.
Platform: Google AutoML
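As a minimal sketch of the TensorFlow export path, assuming the model was exported in TensorFlow SavedModel format and downloaded locally (the directory name below is a placeholder), the export can be loaded with the standard SavedModel API:

```python
import tensorflow as tf

# Placeholder: assumed local directory containing the exported SavedModel.
EXPORT_DIR = "exported_model"

model = tf.saved_model.load(EXPORT_DIR)
infer = model.signatures["serving_default"]  # default serving signature

# The exact input keys and dtypes depend on the exported model's signature;
# inspect them before calling the function.
print(infer.structured_input_signature)
print(infer.structured_outputs)
```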
I trained a machine learning model to filter tweets in which people describe their personal experiences with various subject hashtags. The model is useful for surfacing research insights, especially for topics buried in noise, chatter, and clickbait.
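A minimal sketch of how such a deployed tweet classifier could be called with the google-cloud-automl Python client; the project ID, model ID, example tweet, and label names are all placeholders, not values from the actual project:

```python
from google.cloud import automl

# Placeholder identifiers; substitute your own project and model.
PROJECT_ID = "my-project"
MODEL_ID = "TCN0000000000000000000"  # an AutoML Natural Language model

prediction_client = automl.PredictionServiceClient()
model_path = automl.AutoMlClient.model_path(PROJECT_ID, "us-central1", MODEL_ID)

tweet = "Three weeks into #remotework and my back is done with this kitchen chair."
payload = automl.ExamplePayload(
    text_snippet=automl.TextSnippet(content=tweet, mime_type="text/plain")
)

response = prediction_client.predict(name=model_path, payload=payload)
for result in response.payload:
    # display_name is the predicted label (e.g. a hypothetical
    # personal_experience vs. noise split); classification.score
    # is the model's confidence in that label.
    print(result.display_name, result.classification.score)
```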
Trained and progressively tweaked a machine learning model to distinguish healthy from infected individuals in a large dataset of chest X-rays. (This exercise was part of the Udacity AI Product Manager Nanodegree course.)
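The image-classification case follows the same prediction pattern. Below is a minimal sketch using the google-cloud-automl client against a deployed AutoML Vision model; the project ID, model ID, and file name are placeholders:

```python
from google.cloud import automl

PROJECT_ID = "my-project"  # placeholder
MODEL_ID = "ICN0000000000000000000"  # placeholder AutoML Vision model ID

prediction_client = automl.PredictionServiceClient()
model_path = automl.AutoMlClient.model_path(PROJECT_ID, "us-central1", MODEL_ID)

# Assumed local file name for a single chest X-ray image.
with open("chest_xray.png", "rb") as f:
    image_bytes = f.read()

payload = automl.ExamplePayload(image=automl.Image(image_bytes=image_bytes))

# score_threshold (a string, per the API) drops low-confidence labels.
request = automl.PredictRequest(
    name=model_path, payload=payload, params={"score_threshold": "0.5"}
)
response = prediction_client.predict(request=request)
for result in response.payload:
    print(result.display_name, result.classification.score)
```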
I used tabular machine learning to identify the survey questions most predictive of the outcome. This could help reduce the total number of questions asked in future surveys with minimal impact on the results.
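AutoML Tables reports per-column feature importance after training; as an illustrative stand-in for that idea, here is a minimal scikit-learn sketch on synthetic data, ranking survey questions by how much they contribute to predicting the outcome. The question names, data, and model choice are assumptions for illustration only:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for survey responses: one column per question on a
# 1-5 scale, plus a binary outcome that depends mostly on Q1 and Q3.
n = 500
df = pd.DataFrame({f"Q{i}": rng.integers(1, 6, size=n) for i in range(1, 9)})
outcome = ((df["Q1"] + df["Q3"] + rng.normal(0, 1, size=n)) > 6).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(df, outcome)

# Rank questions by importance; low-ranked questions are candidates to
# drop from future surveys with minimal impact on the predicted outcome.
importances = pd.Series(model.feature_importances_, index=df.columns)
print(importances.sort_values(ascending=False))
```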