While I like the idea, in principle, that you don't need a CS education to use AI/ML, in practice I doubt it. Here's a problem that cropped up today: our instance ran out of hard drive space on a training set of ~400,000 images. The individual images were only 375 GB, but took up 1.5 TB when converted to NumPy matrices. Why? The arrays were stored as standard int arrays (32-bit x 3 channels) when they could have fit into uint8 (8-bit x 3 channels). Each image was 4x as large as it needed to be.
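A minimal sketch of the difference (the 500x500 image size here is hypothetical, just to illustrate the 4x blowup):

```python
import numpy as np

# A hypothetical 500x500 RGB image. Image pixels are 0-255, so uint8
# is sufficient, but a naive conversion can land in a 32-bit int dtype.
img_int32 = np.zeros((500, 500, 3), dtype=np.int32)
img_uint8 = np.zeros((500, 500, 3), dtype=np.uint8)

print(img_int32.nbytes)  # 3,000,000 bytes
print(img_uint8.nbytes)  # 750,000 bytes -- exactly 4x smaller
```

Passing `dtype=np.uint8` up front (or calling `.astype(np.uint8)` on an existing array) avoids the problem, but you have to know dtypes exist to think of it.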
You can certainly use high-level ML tools (like Keras), but it takes a great deal of work to wrangle your data into a usable format, and even more knowledge to debug an ineffective network.