As someone who does manufacturing automation for a living, I don't think this person has any experience in a manufacturing environment.
Coding is really not a limiting factor in manufacturing automation. For decades everything has been built to be controlled first by relay logic and then by PLCs, which support multiple languages, like ladder logic, that are basically already drag and drop. Most sensors are simple - generally they provide a boolean on/off signal, and occasionally there's an analogue output which is pretty easy to interpret. Physically placing and wiring the sensors correctly is the hard bit, not interpreting the data. There are more advanced systems, like computer vision, but they all already have user interfaces which let you interact with them without coding. In fact it's a real pain in the ass that in most cases there is no coding option, so everything has its own proprietary and arcane method of operation and it's unreasonably difficult to get things to talk to each other.
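For readers who haven't seen ladder logic, a rung is essentially a boolean expression over inputs. A hypothetical motor start/stop rung with a seal-in contact, sketched in Python rather than in any actual PLC language:

```python
def motor_rung(start_pb, stop_pb, motor_on, guard_closed):
    """One made-up ladder rung: the motor coil is energized by the
    start button OR its own seal-in contact, and drops out on the
    stop button or an open guard."""
    return (start_pb or motor_on) and not stop_pb and guard_closed

state = motor_rung(True, False, False, True)   # press start: coil energizes
state = motor_rung(False, False, state, True)  # release start: seal-in holds
state = motor_rung(False, True, state, True)   # press stop: coil drops out
```

The "drag and drop" claim follows from this shape: a rung is just contacts and coils wired into a boolean expression, which is why graphical editing has been the norm for decades.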
Further, coding skill is generally not particularly lacking. Assembly-like languages such as G-code are widely used by machines, many tradesmen know enough to hand-edit at least simple programs, and the engineers on staff are generally comfortable with more advanced programming. They're a far cry from software developers, but that's sufficient for the relatively simple cases a no-code solution might be suitable for.
The main issue for automation in manufacturing is not versatility but reliability. A few hours of downtime can cost tens of thousands of dollars, and a machine crash might easily cost 8 figures, to say nothing of the potential for injury or even death. Whatever difficulty there is in coding for manufacturing environments is in structuring these programs such that they are consistently accurate and fail safe. Personally, I don't find anything that makes it easier for someone to tell a machine to do something stupid particularly appealing.
I worked in this sort of space for a while. You got it in your last paragraph. Downtime costs money. Not only does downtime cost money, service calls do too. You get a guy who knows exactly how to fix it out on site, and he shows up with the wrong part while costing 200 bucks an hour. You can burn 4 hours just waiting for a part. We had a lot of good luck adding sensors, remote diagnostics, and pre-diagnostics. That way you could schedule downtime for a part swap and have the right part on hand. Programming and remote control were near the bottom of that money stack. Like you said, most of the stuff is PLC and relay logic. Sometimes you get lucky and there's a decent computer in the middle you can scan, but be careful, as the 485 line may already be saturated with commands.
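The pre-diagnostic idea is simple in code terms: watch a couple of trends and raise a flag early enough to schedule the swap. A toy sketch - the function name, thresholds, and inputs are all invented for illustration, not taken from any real system:

```python
def swap_out_alert(vibration_mm_s, hours_since_service,
                   vib_limit=7.0, service_limit=2000):
    """Return the reasons (possibly none) to schedule a part swap
    before the machine actually goes down."""
    reasons = []
    if vibration_mm_s > vib_limit:
        reasons.append("vibration trending above limit")
    if hours_since_service > service_limit:
        reasons.append("service interval exceeded")
    return reasons

# An empty list means keep running; anything else means order the part
# now and schedule downtime, instead of paying $200/hour to wait for it.
```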
> generally coding skill is not particularly lacking
In my experience it is. Sure, there are people who can go in and hack some simple code, but they typically know nothing about the clean code style you need to make millions of lines manageable. They do okay with 500-line programs, where it isn't hard to keep the whole thing in your head, but don't put them on anything more complex.
As a controls engineer who has integrated dozens of automated manufacturing cells, I have to say: none of the salespeople I interact with have buzzword densities this high! As nbzso mentioned, the higher you climb the less oxygen there is...
There's definitely a need for some kind of simplified programming that's understandable at different levels. Simplest is for an operator, then by a maintenance technician, by a line engineer, by a controls engineer at an integrator (like me), and by applications engineers at manufacturers and distributors.

The operator needs a minimal number of adjustments - this part number might have a harder material and need to run at a reduced speed, or be highly porous and need more adhesive dispensed. That kind of hour-by-hour adjustment is typically custom-built into an HMI by the controls engineer. The tools themselves need to be debugged by maintenance techs; the robot code on a teach pendant or the ladder logic in a PLC are the languages for that. Hopefully, the maintenance tech doesn't have to touch the structure of the program, just some of the conditions or values for certain steps.

Programs increasingly expose data at certain parts of the cycle to external databases. That's the "4IR" or "Fourth Industrial Revolution" or "Industry 4.0" component, or just a spreadsheet export, depending on your altitude.
Whether that data is analyzed in Excel, an MES dashboard, or a Python script depends on the consumer.
There are DSLs at all levels, none of those people are writing assembly language. But "No code" proponents typically remove too much flexibility to the level of the implementor of the product, far removed from the end user.
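The operator-facing layer described above often amounts to a recipe lookup: per-part-number values that can change hour by hour without touching the program's structure. A minimal sketch (part numbers and field names are invented):

```python
# Hypothetical recipe table: the only knobs exposed on the HMI.
RECIPES = {
    "PN-1001": {"feed_mm_s": 120, "adhesive_ml": 1.5},
    "PN-1002": {"feed_mm_s": 80,  "adhesive_ml": 2.5},  # harder, more porous
}

def load_recipe(part_number):
    """Look up values for a part; the operator never edits the
    sequence itself, only selects which recipe runs."""
    if part_number not in RECIPES:
        raise ValueError(f"no recipe for part {part_number}")
    return RECIPES[part_number]
```

The design point is the separation: the controls engineer owns the structure and the table schema, while the operator only ever touches the values.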
Came here to say the same thing. I've been in automation for a loong time and this article seems like an AI-generated jumble of buzzwords.
Maybe in adjacent fields no code and IoT is a thing but in mainstream manufacturing automation I'm not really encountering it. It feels like we just moved a bit past PLCs and proprietary field busses. The main added value is not normally in rapidly changing SW that's generated by non-programmers.
> But "No code" proponents typically remove too much flexibility to the level of the implementor of the product, far removed from the end user.
Not only do they remove flexibility, but they often introduce "magic" that is too difficult to debug when dealing with live industrial systems. It's fairly simple to go through a bunch of iterations with a debugger in a standard programming environment. As soon as you're dealing with a sprawling realtime physical system, you can't do that.
> So, what is no-code? No-code development is perfect for engineers who have zero programming knowledge. It’s based on a visual drag-and-drop user interface, with no hand coding required to reach the end goal. It empowers workers to design solutions they need to overcome the challenges they face every day—without writing a single line of code. In 2019, 84% of enterprises across the US, UK, Canada and Australia had already implemented a low-code development tool or platform to take care of some of their coding needs.
This piece is a fluff piece at best. Looks like a "guest blog" entry to advertise for the writer's company [0]
The problem with no/low code is it mirrors a DSL for whatever problem being solved. Anything more general becomes a programming language.
Maybe my understanding of language is limited? I'd love to be wrong about this.
[0] https://www.emnify.com/
My limited experience with "no/low code" has been that it sacrifices the reuse of logic.
I don't understand how that can possibly substitute for programming, because the reuse and composition of logic is the whole point...isn't it?
It's also the source of complexity, so just making a visual, drag and drop interface isn't going to make development more accessible, I would think. Less, in that it's more cumbersome and expands visually.
My assumption has been that this is an emperor with no clothes that is invariably sold to decision makers who won't use it.
But maybe there's more to it?
> The problem with no/low code is it mirrors a DSL for whatever problem being solved. Anything more general becomes a programming language.
Yes! I've explored visually representing functions and state machines, but it all feels very limiting compared to written language. I think more research could be done in the visual coding area for expressing thought, because I really want coding to look like something out of Neon Genesis Evangelion--but IDK, written language isn't bound spatially. And in written language it's so easy to express contradictions, which is an important part of developing a system. I think M.C. Escher is probably the most advanced research we have in the area of visually representing recursion and contradictions lol.
> The problem with no/low code is it mirrors a DSL for whatever problem being solved.
This is the entire advantage too. If you read this like a hard constraint, your new job is to simply make sure you develop a high-quality model of the problem domain. The rest could be put on the business owners, especially if you can teach them a little bit of SQL (the world's most popular DSL).
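As a concrete illustration of SQL-as-DSL (the schema and numbers are invented), the query at the bottom is the part a business owner can safely own once the domain model is in place:

```python
import sqlite3

# The developer fixes the domain model: tables, columns, types.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "east", 100.0), (2, "west", 40.0), (3, "east", 60.0)])

# The reporting question is the part that can be handed to a
# non-programmer who knows a little SQL.
totals = conn.execute(
    "SELECT region, SUM(total) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(totals)  # [('east', 160.0), ('west', 40.0)]
```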
So there are really two problems being conflated in the phrase "no code":
1. We need a richer library of standardized DSLs for various real-world environments, where the power of a full PL is perhaps too dangerous without excessive training.
2. The method of specifying algorithms in plain text (i.e. "code") is unintuitive and unapproachable for large numbers of people who could otherwise benefit greatly from the ability to "program" the tools they use.
Complete with debugging and maintenance and all of the other things that make "coding" hard.
I wish people would stop claiming that drag and drop program creation is not programming. It is simply a different style of programming using a different language.
From Wikipedia:
"Computer programming is the process of designing and building an executable computer program to accomplish a specific computing result or to perform a specific task."
Outside of tech, I think coding means green text on a black background--but I think of it more generally, describing a process that you want something/someone else to run... a cooking recipe for someone to follow, it's CODE! So yeah, to that end, visual programming is still coding. AI that generates code, well, you still have to tell the AI what to do, still coding--on that tangent, if AI code generation ever is perfected, I'd rather work in that language, and not even look at the output like assembly.
I really want visual programming to succeed, but there's something fundamentally harder with it. I'm starting to think written language can't be beat for expressing thought. Visual programming is probably a great introduction to computer programming for people to quickly automate simple tasks, but I rest assured that as soon as they want to describe any complex process, they're going to fall into written language like the rest of us--or do lots of copying and pasting and create a mess. To write is to think, I think.
I don't really care if it's called programming or not, but whenever I've inherited a LabVIEW or similar type of project, I've spent more time trying to figure out what the "programmer" tried to do than it would take to rewrite it from scratch in a traditional language/environment.
I have nothing nice to say about LabVIEW. It's the very reason I changed my career from electrical to software engineering.
I often claim software is a new form of literacy. And I see no-code as the equivalent of cutting up all the frames from Marvel comics and tippexing out the speech bubbles, then handing them to some illiterate and saying "hey, you can write your own story now!"
It's not how we (as a society) need to handle this.
Edit: i.e. that is not empowering the illiterate.
I think the issue with your analogy is that no-code programming is more like cutting up all of the idioms and narrative elements (in abstract) from Marvel comics and offering them to an illiterate and saying "Tell me a story using these parts". Ultimately, storytelling is composed of tropes. Every once in a while a breakthrough is made in aesthetic/narrative style, but that's just a new feature for a no-code platform. Not everyone needs to be writing bleeding edge or avant-garde software – a lot of people just need simple and reliable internal tools.
https://www.qwantz.com
As a side effect of pushing the "automation" agenda, corporations will not have the workforce needed to accomplish the task.

Marketing delusions and wishful thinking are popular methods of self-destruction.

Sadly, in the corporate world, the more you climb the less oxygen you get. And with a lack of oxygen the human brain is prone to malfunction.

You can have some form of automation, but the need for competent programmers will not vanish into thin air.
>Is PHP worth learning in 2021?
The first Google result:
PHP is an open-source programming language that is completely free, and because it supports all the main browsers, it is highly scalable. ... PHP is not dying and is definitely worth learning in 2021 and beyond. There are still thousands of jobs available for new PHP programmers
https://www.google.com/search?q=Is+PHP+worth+learning+in+202...

I'm calling that out as a straw man. Who uses PHP to run gear / robots?
As a consultant in the business automation and data space: the promised "paradigm shift" is sort of accurate, but the author is a little too optimistic about the implications.
No-code/low-code has been a game changer in many ways for businesses. It lets people dip their toes into improving their efficiency and productivity with low-hanging automations and simple apps. These things can have huge payoffs. And if you stick within that realm, it actually is a paradigm-shift to be able to build solutions as a non-tech person.
But there's always a point where you:
1. need more complex no-code/low-code solutions to get big pay-offs
2. have a mess of so many no-code/low-code solutions that they end up causing tons of maintenance/issues and you have trouble getting all your separate solutions to play nicely together
You end up spending a lot of money to get someone who is very advanced with those no-code/low-code tools (which usually ends up with code solutions inside of those tools), or end up building completely custom solutions anyway.
Also, there are a lot of nuances in what could be considered no/low-code. This article isn't too clear on which types of applications are considered no/low-code, and throws out everything from cellular-IoT infrastructure to MES. Is something like Tableau low-code? Or are we talking more about the Zapier kind of tools? Both of those examples have giant roadblocks once you get to a certain point, where you need some type of dev/architect knowledge to really get the most use out of the tools.
Dare I say it's because AI succeeded. AI is no longer a barrier to entry; there are plenty of off-the-shelf libraries and Python examples out there to get you going.
The result is that a lot of under-the-hood intelligence is automated. The inputs, outputs, and process selection (rather than process design) are all that's left to hype up as a differentiator in an ever more commoditized market.
It seems like every few years we get to experience a wave of 'no code' solutions with varying degrees of success. LiveCode isn't too bad, but there is still some code there. Which brings me to my next point: the way it is described, this solution still appears to be coding... under the guise of moving things around in a GUI. Don't complex programs have the same kind of flow at the very outset of the design process?

I dunno. Maybe it's time to get coffee. It's been a long day.
The fundamental problem with no-code/low-code products is that they are a solution to the wrong problem. To an outsider, code is basically indecipherable. Just like weird math symbols, bizarre physics equations, scientific jargon, etc.
Because they don't know how to read code (or substitute in any of the other examples I gave), the outsider assumes that that's the hard part: "If the code/jargon/equation was simply in plain English, anybody could program/do science/understand math".
But they're wrong. Anybody who spends enough time to pick up the actual underlying skill (whether it be programming, mathematics, or science) will trivially pick up the language of the trade, which, on the whole, exists because it expresses complicated ideas in a convenient way.
The hard part of programming is solving the actual problems, not expressing those solutions in code. Likewise, the hard part of electrical engineering isn't reading circuit diagrams, or the hard part of being an author is knowing English. Thus, if you want to make programming your system easier, you don't throw out the idea of code. You solve as many of the hard problems as you can, and then supply those solutions as a library.
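In code terms, the argument is that the leverage comes from packaging solved problems, not from changing the syntax. A toy example of the "solve it once, ship it as a library" shape - the function name and the settling heuristic are invented for illustration:

```python
import statistics

def steady_state_mean(samples, settle_fraction=0.25):
    """Average a sensor trace after discarding the start-up transient.
    The judgment call (how much of the trace to discard) lives inside
    the library, so the caller just asks for 'the' reading."""
    if not samples:
        raise ValueError("no samples")
    skip = int(len(samples) * settle_fraction)
    return statistics.fmean(samples[skip:])

reading = steady_state_mean([0.0, 2.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0])
```

The caller's code stays one readable line, whether the caller writes text or drags blocks; the hard part was deciding what the function should do.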
I work for a "no-code" app (Kalipso Studio) with a focus on logistics and shop-floor applications. But the programmer needs to have some basic programming skills and a little bit of SQL. This guy exaggerates a bit. We have many customers using this mainly to interface with PLCs, which now come with web services or have other software to convert the protocol, etc. We have companies using this to control robots, but the movements were programmed natively. Now they just use our software to manage the robot, get data/statistics, and load different parameters.

I don't think "no-code" is for people with less skill. Our clients buy this because they want to be more productive, or they can't or don't want to buy the solution from an external company. Governments and the military also don't like external companies, so they buy it to use internally, like big companies do. And apps in industry are always changing, so they need to be fast to adapt. They don't need a very fancy interface; most of them will never be published. They are not building a TikTok app.
Get an adblocker - with ublock on desktop I can see literally zero ads on this site (except the sharebar at the end which, honestly, is rather unobtrusively placed)
With a title like that, I thought about mechanical engineering. Humanity built machines to "automatically" perform tasks for the last 100+ years. Factories seem to always aspire toward automation at any technological level, even if fully mechanical.