top | item 30940571

overkalix|3 years ago

Greg probably also knows SAS and AMPL, and has a good knowledge of ops research, which is within stone-tossing distance of whatever ML is pretending to be this week.

NumberCruncher|3 years ago

After 15 years of experience with SAS, this sounds to me like saying "knowing how to write and having a pen makes you a poet". But it depends on how far you can toss a stone...

whatever1|3 years ago

OR and ML have their own space in manufacturing.

OR is perfect when you can describe explicitly what the decision space is and what the restrictions are.
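That's the OR framing in one line: decision variables, a feasible region defined by explicit restrictions, and an objective to optimize. A toy sketch with entirely made-up numbers (two hypothetical products, one machine-hour limit, one raw-material limit), solved here by brute enumeration rather than a real LP/MIP solver:

```python
from itertools import product

# Hypothetical data: profit per unit, machine-hours per unit,
# and kg of raw material per unit for two products.
PROFIT   = {"A": 40, "B": 30}
HOURS    = {"A": 2,  "B": 1}
MATERIAL = {"A": 1,  "B": 3}
MAX_HOURS, MAX_MATERIAL = 40, 45

def best_plan():
    """Enumerate the (small, explicit) decision space and keep the best
    production plan that satisfies both resource restrictions."""
    best, best_profit = None, -1
    for qty_a, qty_b in product(range(MAX_HOURS // HOURS["A"] + 1),
                                range(MAX_MATERIAL // MATERIAL["B"] + 1)):
        if (HOURS["A"] * qty_a + HOURS["B"] * qty_b > MAX_HOURS or
                MATERIAL["A"] * qty_a + MATERIAL["B"] * qty_b > MAX_MATERIAL):
            continue  # violates a restriction: outside the feasible region
        profit = PROFIT["A"] * qty_a + PROFIT["B"] * qty_b
        if profit > best_profit:
            best, best_profit = (qty_a, qty_b), profit
    return best, best_profit
```

In practice you'd write the same model in a modeling language like AMPL and hand it to an LP/MIP solver, which scales where enumeration doesn't; the point is only that the decision space and restrictions are stated up front.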

ML is a great fit when you want to identify and exploit patterns. Quality control with machine vision is a good application for ML. NLP for PDF documents is a huge field for manufacturing as well: companies have so much data sitting in email attachments that they do not currently take advantage of.

overkalix|3 years ago

> OR is perfect when you can describe explicitly what the decision space is and what the restrictions are.

As opposed to having to figure it out later from the outputs of a black box?

> Quality control with machine vision is a good application for ML.

I can't imagine CV being an actual replacement for SPC in many industries. There's a reason we need to take samples and stress test, analyze composition, etc.

> NLP for PDF documents is a huge field for manufacturing as well.

NLP could be big everywhere... if it provides actual value, which is not a given. ML has a lot of tangential applications (better forecasting, say), but how will it directly improve manufacturing processes?

I apologize for being abrasive, but I'm so tired of cs people descending upon all industries, plugging shit data into pytorch and doing shitty ML like it will automatically add value. Even more so in industrial engineering, which in my experience is full of people way better at math than computer scientists and requires a deep understanding of the product and the manufacturing process.

andrewf|3 years ago

A tangent, if you have time: where would I go for a primer on operations research and/or discrete event simulation?

My thought is that Goldratt's "The Goal" / theory of constraints is a useful way of thinking about optimizing throughput in a computer system. http://www.qdpma.com/Arch_files/RWT_Nehalem-5.gif plus an instruction latency table is something like a well-modeled factory. (The Phoenix Project applies these principles to project management, which I think is a somewhat less useful analogy!)

I'm curious about applying existing tools to modeling things like: how will this multi-tiered application behave when it gets a thundering herd of requests? What if I tweak these timeouts, adjust this queue, make a particular system process requests on a last-in-first-out basis? Can I get a pretty visualization of what would happen?
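For that last question, here is a minimal discrete-event-style sketch (all parameters invented) of a single server draining a queue faster-arriving-than-it-can-serve, comparing FIFO against last-in-first-out scheduling:

```python
from collections import deque

def simulate(discipline="fifo", n_requests=20, interarrival=0.5, service=1.0):
    """Tiny discrete-event simulation of a single-server queue.

    Requests arrive every `interarrival` seconds; each takes `service`
    seconds to process. Returns each request's waiting time (time spent
    queued before service starts) under FIFO or LIFO scheduling.
    """
    arrivals = [i * interarrival for i in range(n_requests)]
    queue = deque()   # arrival times of requests waiting for the server
    waits = []
    t = 0.0           # current time = when the server is next free
    i = 0             # index of the next arrival not yet enqueued
    while len(waits) < n_requests:
        # enqueue every request that has arrived by time t
        while i < n_requests and arrivals[i] <= t:
            queue.append(arrivals[i])
            i += 1
        if not queue:
            t = arrivals[i]   # server idle: jump to the next arrival
            continue
        arrived = queue.popleft() if discipline == "fifo" else queue.pop()
        waits.append(t - arrived)
        t += service
    return waits
```

With these made-up numbers the total (hence mean) waiting time comes out identical under both disciplines, but the distributions differ sharply: LIFO keeps fresh requests fast while starving the oldest ones, which is exactly the tail-latency trade-off a fuller DES toolkit (SimPy, for instance) would let you model and visualize with timeouts, retries, and multiple tiers added in.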

tuxguy|3 years ago

lol-ing at "Whatever ml is pretending to be this week"

so funny, because so accurate :)