To the best of my understanding, the ability to re-score mis-keyed responses is a feature request (https://ideas.digitaled.com/) and not part of the current release. Also (and this part I am not 100% sure of), if Student A attempted the original question, they will see the original question in their gradebook, while if Student B attempted the "revised" question, they will see the "revised" question in their gradebook. It would be far more confusing if everyone saw the revised question even though half of them were graded against the mis-keyed answer. To this end, I would recommend using the "edit question comment" field when manually correcting the mis-keyed student responses.
Although the Student ID cannot be pulled into the algorithm, there are some other great options:
1) As Meta mentioned, you can use a combination of Adaptive Assignments utilizing Maple Repositories
2) You can generate a "pseudo-student" ID by using a response from a prior question and then generating a number from that string.
For example: I built an assignment with the following details:
Question 1: Asks the student to write their name in the box: "I, ___________, certify that this is my assignment." If a student writes a name that does not appear in the comma-separated list, their first question is highlighted (as it is graded incorrect).
Question 2: Asks the student to write an ASCII string corresponding to their response to Question 1 (i.e. their name): "Take each letter in your name and convert your name to an ASCII string that is padded to a total width of 4 digits per letter: ___________________". We use Maple (inside a Maple-graded response) to convert all the possible student names to the same ASCII string and then check whether the student's response is correct (based on their response to the first question).
Question 3: Asks the student to (mod 5) each character in their second answer: "Take each character in the ASCII string above, find mod 5 of the character, and indicate the resulting string below: ___________". Again we use Maple to check the student's supplied answer against what they should have answered based on the first question.
Basically, the above allows you to completely customize an assignment based on what the student entered in the first answer box of the first question. (I recommend using a table/Matrix with the first column as student names, and each column thereafter as the answers to Question #1, Question #2, etc.)
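As a rough sketch of the string manipulation behind Questions 2 and 3 (the name "Ada" and the variable names are just illustrative, not taken from the attached module), the Maple side might look like:

```maple
# Question 2: convert a name to an ASCII string, padded to 4 digits per letter
name   := "Ada":
codes  := convert(name, bytes):                      # [65, 100, 97]
padded := cat(seq(sprintf("%04d", c), c in codes)):  # "006501000097"

# Question 3: take each character of that string mod 5
m5 := cat(seq(convert(irem(parse(padded[i]), 5), string),
              i = 1 .. length(padded))):             # "001001000042"
```

Comparing the student's Question 2/3 responses against strings built this way (driven by their Question 1 name) is then a simple string-equality check.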
Here is a Mobius Module that demonstrates the above: Module.zip
The Display command uses the Typesetting package native to Maple's Document mode. However, since Möbius is a web app, everything needs to be converted into MathML (or LaTeX or similar) to be supported by browsers.
The closest I can think of (with this method) is the following: $a=maple("use InertForm:-NoSimpl in a:=2*(3*x) end use; printf(InertForm:-ToMathML(a));");
But I don't know if that will generalize well to the rest of your problem.
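For reference, a related approach (a sketch; I haven't tested it against your full problem) is to build the inert expression from a string with InertForm:-Parse, which avoids the use ... end use wrapper:

```maple
a := InertForm:-Parse("2*(3*x)"):  # inert form: not simplified to 6*x
InertForm:-ToMathML(a);            # MathML string for the browser to render
```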
One solution would be to hold the answer in a variable. For example, to calculate power = energy/time we could use variables such that $P = $E/$T. In the response area (assuming you are using Numeric) you can then set "Required with" to "absolute accuracy". If your randomisation means $P ends up with more decimal places than you want, you can use the command "decimal(n,x)" in the algorithm section, where x is the input and n is the number of decimal places; in this case it would be decimal(2,$P).
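Putting that together, a minimal algorithm section along those lines (the ranges here are made up for illustration) might read:

```
$E = range(10, 100);     # energy, randomised
$T = range(2, 10);       # time, randomised
$P = decimal(2, $E/$T);  # power, rounded to 2 decimal places
```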
It seems that they have an iframe embedder feature, so it's pretty much a copy-paste of their embed code into the source code of a Mobius question. I've attached a TA question with one of their demos as an example: Example.zip
If you're comfortable with the Maple programming language, you can try 'inert form'. It will allow you to grade the student's response in its original form, without simplifications. Click here for an example.
Alternatively, you can convert the response to a string and use string tools to check whether the expression contains (the right number of) brackets; this should let you identify the expression type. I'm thinking along the lines of: the student answer is equal to the correct answer AND the number of "(" brackets is right.
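As a sketch of that idea in Maple-graded code (the target 2*(3*x) is just a stand-in for your correct answer, and I'm assuming StringTools:-CountCharacterOccurrences behaves as I expect; adjust to your actual question):

```maple
resp := "$RESPONSE":   # the student's answer, as a string
ok := evalb(simplify(parse(resp) - 2*(3*x)) = 0)                  # mathematically equal
      and StringTools:-CountCharacterOccurrences(resp, "(") = 1:  # right bracket count
ok;
```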
With regards to the "y=" part: it's hard to say without the actual grading code. You could try to pull the equation apart with the lhs and rhs commands. For example, "y = a + b" would give you "y" and "a+b" respectively. This should allow you to grade the equation in two parts without equation comparisons.
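For example (a sketch; the equation y = a + b here is just a stand-in for yours):

```maple
eq := parse("$RESPONSE"):   # e.g. the student typed "y = a + b"
lhs(eq);                    # y
rhs(eq);                    # a + b
ok := evalb(lhs(eq) = y) and evalb(simplify(rhs(eq) - (a + b)) = 0):
```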
When assigning variable names to equations (Maple syntax) I often forget the colon in ":=", for example writing "my_equation = y = a+b" instead of "my_equation := y = a+b". This leads to similar error messages.
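For example:

```maple
my_equation := y = a + b:   # correct: ":=" assigns the equation to the name
# my_equation = y = a + b;  # wrong: "=" is not assignment here, and chaining it raises a parse error
```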
The 40px spacing applies to the iframe for the entire HTML question. Whilst your trick worked for me, I'd suggest moving all of your Question Text into the "Question HTML" part of the response area: