You are both right. Creator (and other machine control software packages) simply sends the temperature setpoint (target) to the microcontroller board, and the firmware is responsible for closed-loop temperature control. The controller in the Marlin firmware is a straightforward Proportional + Integral + Derivative (PID) controller (plus an autotuning capability --- more on that below), and the output of that algorithm adjusts the duty cycle (percent time ON vs. overall cycle time) of the heater (bed or extruder).
When close to setpoint, the temperature control is in a "linear" region, and the response is pretty good. Basically, the output duty cycle applies, on average, enough power to the heater to make up for the loss to filament and ambient. If the temperature drifts away from setpoint, the average power is adjusted. With Proportional action, the further from setpoint, the larger the adjustment. The Integral action makes sure that any error that persists over time gets "mopped up": it steadily pulls the average power back toward whatever level holds the temperature at setpoint. Derivative (rate) action responds to how fast the measured temperature is changing: a rapid change gets a larger counter-adjustment, a slow change a smaller one.
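As a rough sketch (in Python, not Marlin's actual C++ code), the loop described above might look like the following. The gains, the [0, 1] duty-cycle range, and the anti-windup clamp are all illustrative choices, not Marlin's:

```python
class PID:
    """Minimal PID sketch: output is a heater duty cycle in [0.0, 1.0]."""

    def __init__(self, kp, ki, kd, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_temp = None

    def update(self, setpoint, temp, dt):
        error = setpoint - temp              # Proportional term input
        self.integral += error * dt          # Integral "mops up" persistent error
        # Derivative on the measurement (not the error) so a setpoint
        # jump doesn't cause a derivative "kick"
        d_temp = 0.0 if self.prev_temp is None else (temp - self.prev_temp) / dt
        self.prev_temp = temp
        out = self.kp * error + self.ki * self.integral - self.kd * d_temp
        # Clamp (saturate) to a valid duty cycle, and back-calculate the
        # integral so it doesn't keep winding up while saturated
        clamped = max(self.out_min, min(self.out_max, out))
        if clamped != out and self.ki > 0:
            self.integral -= (out - clamped) / self.ki
        return clamped
```

Far below setpoint, `update()` returns 1.0 (full power, i.e. the saturated case discussed below); near setpoint it returns a fractional duty cycle that the proportional, integral, and derivative terms nudge up or down.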
The Marlin firmware has an "Autotune" capability. If you send the G-code that triggers the autotuner (M303) for a particular temperature, the firmware will make a "bump" in the setpoint, then watch how the temperature responds over time. Based on how soon the temperature starts to change (deadtime), how fast it changes (process lag time), and how far it changes (process gain), it calculates new "gain" parameters for the P, I, and D terms of the control algorithm. It displays those values; other G-codes apply them (M301 for the hotend, M304 for the bed) and store them in non-volatile memory (M500), where they stay until you change them again.
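For the curious: Marlin's actual autotune performs a relay-style test around the target, but the deadtime / lag / gain description above maps directly onto the classic open-loop "reaction curve" tuning rules. Here is one common rule set (Ziegler-Nichols open loop) as a sketch; this is a textbook formula, not Marlin's code, and the example numbers are invented:

```python
def zn_open_loop(process_gain, deadtime, lag):
    """Classic Ziegler-Nichols open-loop (reaction curve) tuning rules.
    process_gain K: degC change per unit of heater output, at steady state
    deadtime L:     seconds before the temperature starts to move
    lag tau:        process time constant, in seconds
    Returns (Kp, Ki, Kd)."""
    kp = 1.2 * lag / (process_gain * deadtime)
    ti = 2.0 * deadtime           # integral (reset) time
    td = 0.5 * deadtime           # derivative (rate) time
    return kp, kp / ti, kp * td

# Hypothetical numbers: 150 degC/unit gain, 5 s deadtime, 60 s lag
kp, ki, kd = zn_open_loop(process_gain=150.0, deadtime=5.0, lag=60.0)
# kp = 0.096, ki = 0.0096, kd = 0.24
```

Note these classic rules are of the "hot" variety mentioned below: fast to setpoint, with some overshoot.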
When NOT close to setpoint, however, you are outside the "linear" region that all the math behind PID control theory depends on. When you first fire up the heater, the difference between the setpoint (the temperature you want) and the actual temperature is so large that the output of the PID function is 100%, so the power is set to 100% ON time. Control systems engineers call this a "saturated" output. (It happens because you have to add a lot of heat to get to temperature using a limited heat source.) When the output is saturated, you're basically "not in control". Worse, when you start heating, it takes some time for the temperature measurement to even START to change; and once the heater has been on full, the temperature keeps climbing for a while after you turn it off, because the heater stays hotter than the measurement point for some time.
There are "Model Predictive Controller" (MPC) designs that can deal with this situation WAY better than PID control can. They basically make a mathematical model of the "process" (heater and hardware and measurement) and predict how the applied power will affect the temperature over time. MPCs can deal with the fact that there's some stored heat between the heater and the measurement point and reduce the power to the heater at just the right time BEFORE reaching setpoint so the temperature can "finish coasting up". A good MPC-based temperature controller can actually autotune itself as you turn it on by measuring the delay in the temperature starting upward when you first turn on the heater and measuring the maximum slope in temperature on the way up!
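To make the "coast up" idea concrete, here is a toy Python sketch (not a production MPC, and not anything Marlin ships). It uses a made-up two-state model in which the heater block stores heat and the sensor lags behind it; at every step the controller simulates the model forward with the power OFF, and cuts power early whenever that predicted "coast" trajectory would already reach the setpoint. All constants are invented for illustration:

```python
AMBIENT = 25.0
K_HEAT, K_XFER, K_LOSS = 4.0, 0.02, 0.01   # hypothetical model constants

def step(heater, sensor, power, dt):
    """One Euler step of the toy heater-block / sensor model."""
    flow = K_XFER * (heater - sensor)        # heat flowing toward the sensor
    heater += (K_HEAT * power - flow) * dt
    sensor += (flow - K_LOSS * (sensor - AMBIENT)) * dt
    return heater, sensor

def coast_peak(heater, sensor, dt=0.1, horizon=600):
    """Predict the highest sensor temperature if power is cut right now."""
    peak = sensor
    for _ in range(horizon):
        heater, sensor = step(heater, sensor, 0.0, dt)
        peak = max(peak, sensor)
    return peak

def simulate(setpoint, predictive, steps=2000, dt=0.1):
    """Run the warm-up and return the peak sensor temperature."""
    heater = sensor = peak = AMBIENT
    for _ in range(steps):
        if predictive:
            # Cut power BEFORE setpoint, once the coast would get there
            power = 1.0 if coast_peak(heater, sensor) < setpoint else 0.0
        else:
            power = 1.0 if sensor < setpoint else 0.0   # naive bang-bang
        heater, sensor = step(heater, sensor, power, dt)
        peak = max(peak, sensor)
    return peak
```

In this toy model the naive bang-bang run overshoots the 195 degC setpoint by several degrees (the stored heat keeps arriving at the sensor after cutoff), while the predictive run cuts power early and coasts up to the setpoint without overshooting.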
So why use PID over MPC? PID requires three parameters (gains) and some fairly straightforward math for rates of change, accumulators, and such. The PID autotune is a bit more complicated, but not much. (BTW, PID autotune algorithms can be written to apply "hot" tuning, which gets to setpoint faster but overshoots, or "damped" tuning, which takes longer to reach the set temperature but doesn't overshoot. I don't know which type of response the Marlin PID autotune is striving for. But, again, this only applies in the linear region and doesn't deal with the initial overshoot due to saturation...) Model Predictive Controllers require much more data storage and statistical math to model the process, and similarly for the MPC autotune. It may be that the ATmega MCU on the RAMBo board (as used on my M2) doesn't have enough spare ROM for the algorithms or, more likely, enough RAM for the needed data storage. (There are also probably patent issues with some MPC algorithms.)
A simple workaround might be to have a "linear band" sequence for the PID. It could work something like this:
1. If the temperature is outside some defined band (say, +/- 10 degrees C) from target, apply a setpoint at the band edge.
So if you are going for 195 for extruder temperature, set the setpoint to 185.
2. When the temperature first levels off (presumably at its maximum overshoot), change the setpoint to the final target value.
So, when the temperature zooms past 185 up to, say, 193, and then just starts to decrease, set the setpoint to 195.
You're basically doing an uncontrolled ramp-up to get to the linear region, then switching to PID control once you're semi-stable in the region.
You wouldn't need to change the firmware, I think. The Machine Control Panel in Creator could handle such a sequence with only one or two additional parameters in the FFF settings specifying the size of the "linear band" to use. If it's +/- 10 degrees C, Creator would send a temperature setpoint of (target - 10) when the temperature is below the band and (target) once it's in the band. (You'd have to offset each of these a bit so you don't get stuck at, say, 184.9 degrees when you're trying to get to 195. But, hey, implementation details....)
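The two-step sequence above could be sketched host-side like this (hypothetical code, not actual Creator internals; the level-off detection stands in for the "offset a bit" detail, since it switches on the first downturn rather than on crossing an exact threshold):

```python
class BandSequence:
    """State machine for the two-step "linear band" warm-up sequence."""

    def __init__(self, target, band=10.0):
        self.target = target
        self.band = band
        self.ramping = True      # still in the uncontrolled ramp-up phase
        self.prev = None

    def setpoint(self, temp):
        """Given the latest temperature reading, return the setpoint to send."""
        # Phase 2: once we've switched over, just send the final target.
        if not self.ramping:
            return self.target
        # Detect the first level-off: temperature just started to drop,
        # and it happened inside the band (so a noise dip way below the
        # band can't trigger the switch early).
        if (self.prev is not None and temp < self.prev
                and temp > self.target - self.band):
            self.ramping = False
            return self.target
        self.prev = temp
        return self.target - self.band       # band-edge setpoint while ramping
```

So for a 195 degC target, the sequence sends 185 while ramping, and switches to 195 the moment the temperature peaks (say at 193) and starts back down.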
Anyway, hope this long explanation helps. Meantime, I'll look around a bit and see if I can find a way to do some basic MPC that would not have too big a RAM or ROM footprint....
( http://www.isa.org/ and http://www.isa.org/CAP )