

I am sure Acorn and CNC12 do the job well. I agree that there is far more MMI and support for all things CNC. It absolutely looks to be a polished and professional DIY option, and that may be enough for most. I was just trying to go a level deeper and ask about a fundamental design element of the solution. None of what you have stated, and only minimally what the Centroid team has stated, explains the need for tight coupling of the PC and controller. E-stops and the like are outside this discussion; they should be hardwired and trump anything the control system does. Tightly coupling the PC and controller for real-time job processing adds a link in the chain (another potential failure point) that I propose may not be needed. Of course you want near real-time information about the job, and the ability to intervene with manual control where needed, but none of that requires the PC to be doing real-time processing or buffering of the job's data (g-code). Preprocessing maybe, but not real time. I fully agree that the link (for monitoring and manual control) should be up at all times and monitored for failure, but should a (possibly temporary) failure in that link cause a job failure? I realize it was stated that it may recover and the job may finish; it all depends. Can someone point me to where in the manual it explains how to recover a job that fails in this way? I am really asking, not making definitive statements here. I am the newbie simply asking why it is done this way. I am hoping someone can provide a use case for why this is needed and how it is better than what I have proposed.

tblough wrote: ↑Mon Jan 08, 2018 5:55 pm
The Acorn is designed for much more than a little 3D printer where, if something bad happens, you lose a little plastic. The Acorn can control big iron where a mishap can cause tens of thousands of dollars worth of damage, lost fingers, and significant lost revenue. The PC is the user interface. A machinist needs to know feed rate and spindle speed and may need to adjust both as the cut progresses. The backplot lets the machinist know how far along the program is, and the screen displays information on tool changes and lets the operator adjust the tool information to compensate for tool wear.
Subtractive machining is a lot more involved at the machine than additive manufacturing, where all of the heavy lifting is done during pre-processing. Despite the price, this is not a Rambo or a Smoothie controller board. This is a professional-level control.
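To make the proposal a couple of posts up concrete, here is a minimal sketch in Python (hypothetical names throughout, not a description of how Acorn or CNC12 actually work) of a controller that holds the whole job in its own buffer while the PC link carries only status reporting and manual overrides, so a temporary link outage interrupts the display, not the cut:

```python
# Hypothetical decoupled architecture: the motion controller owns the full
# job buffer; the PC link is only for status and manual overrides, so a
# dropped link pauses reporting rather than aborting the job.

import queue
import threading
import time


class Controller:
    """Executes a preloaded g-code job from its own buffer."""

    def __init__(self, program):
        self.program = list(program)      # entire job transferred up front
        self.status = {"line": 0, "feed_override": 1.0}
        self.overrides = queue.Queue()    # manual-control messages from the PC

    def run(self):
        for i, block in enumerate(self.program, start=1):
            # Apply any override that has arrived; link silence is ignored.
            try:
                self.status["feed_override"] = self.overrides.get_nowait()
            except queue.Empty:
                pass
            self.status["line"] = i
            time.sleep(0.01)              # stand-in for executing the block


def monitor_link(controller, link_up):
    """PC-side loop: read-only status plus overrides, tolerant of outages."""
    while controller.status["line"] < len(controller.program):
        if link_up.is_set():
            print("line", controller.status["line"],
                  "override", controller.status["feed_override"])
        # If the link is down we simply miss updates; the job keeps running.
        time.sleep(0.1)


if __name__ == "__main__":
    ctrl = Controller(["G0 X0 Y0", "G1 X10 F200", "G1 Y10", "G1 X0"] * 25)
    link_up = threading.Event()
    link_up.set()
    threading.Thread(target=monitor_link, args=(ctrl, link_up), daemon=True).start()
    # Simulate a half-second PC/controller link outage partway through:
    # status updates stop, but the controller keeps cutting from its buffer.
    threading.Timer(0.3, link_up.clear).start()
    threading.Timer(0.8, link_up.set).start()
    ctrl.run()
```

This is only a sketch of the monitoring-only coupling being asked about; it deliberately leaves out the hardwired E-stop path, which stays outside the PC link either way.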
You do realize that paper deals with toolpath creation, i.e. going from geometry to g-code, NOT executing the g-code. Toolpath creation is a COMPLETELY different can of worms, but one that is reasonably amenable to parallel processing, since the toolpaths for different parts of the work can be generated reasonably, often totally, independently. For many reasons, actually EXECUTING g-code is not nearly as amenable to parallel processing; it is inherently a more serial process, because what happens in each step often depends on what happened in previous steps.

mikes wrote: ↑Wed Jan 10, 2018 6:53 pm
The little investigating I have done thus far does show that this can be very computationally expensive stuff, and in fact the next big hurdle in speeding up the generation of toolpaths is parallelization of the problem set. Based on Centroid's single-threaded specs for the PC, I suspect they have yet to fully crack that nut (at least for this line of controllers). It was stated that many of the algorithms were designed to run on the processors of the past and that in many cases they are still processed sequentially in a single thread. So maybe that is partly the answer to my question. If parallel processing were employed, this could potentially be much faster. I was reading about this in a scholarly paper from Clemson University. Here is the link in case you are interested: https://tigerprints.clemson.edu/cgi/vie ... sertations
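To illustrate why execution is the serial part, here is a small Python sketch (a made-up g-code subset, not any particular control's interpreter) showing that the meaning of each block depends on modal state carried over from every block before it, which is exactly what makes executing g-code hard to split across threads even when generating toolpaths for independent regions is not:

```python
# Why executing g-code resists parallelization: each block's meaning depends
# on modal state (G90/G91, active feed, current position) accumulated from
# every block before it, so blocks cannot be interpreted out of order.


def execute(blocks):
    state = {"absolute": True, "feed": 0.0, "pos": {"X": 0.0, "Y": 0.0, "Z": 0.0}}
    moves = []
    for block in blocks:
        target = dict(state["pos"])
        for word in block.split():
            letter, value = word[0], word[1:]
            if word == "G90":
                state["absolute"] = True
            elif word == "G91":
                state["absolute"] = False
            elif letter == "F":
                state["feed"] = float(value)   # modal: sticks until changed
            elif letter in "XYZ":
                # "X5" means something different in G90 vs G91 mode, and in
                # G91 it needs the position left by the previous block.
                coord = float(value)
                target[letter] = coord if state["absolute"] else state["pos"][letter] + coord
        if target != state["pos"]:
            moves.append((dict(target), state["feed"]))
            state["pos"] = target
    return moves


program = ["G90 F200", "G1 X10 Y0", "G91", "G1 X5", "G1 Y5 F100"]
for pos, feed in execute(program):
    print(pos, "at F", feed)
```

The last move cannot be resolved without first working through the G91 and the move before it, whereas two separate pockets in a CAM job can have their toolpaths generated on different cores and simply concatenated afterwards.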