Hello, I’m brand new to JavaScript and I’ve been working through the course and the exercises.
On exercise 16:
- Write a script that calculates the total price for a user after a shopping trip.
- The items that can be bought are bread, milk, cheese, and yogurt.
The prices are as follows:
- Bread: €2
- Milk: €1.50
- Cheese: €4
- Yogurt: €1.20
I used the following answer:
var bread = "2";
var milk = "1.5";
var cheese = "4";
var yogurt = "1.2";
alert("Hi! Welcome to the Ivan on Tech Shop!");
var qtyBread = prompt("How much bread do you want?");
var qtyMilk = prompt("How much milk do you want?");
var qtyCheese = prompt("How much cheese do you want?");
var qtyYogurt = prompt("How much Yogurt do you want?");
var total = (bread*qtyBread + milk*qtyMilk + cheese*qtyCheese + yogurt*qtyYogurt);
console.log("The total amount to pay is", total);
The code works fine and gives the correct result; however, I noticed my answer was very different from the one given in the solution, which is:
var bread = 2;
var milk = 1.5;
var cheese = 4;
var yogurt = 1.2;
var sum = 0;
alert("Hi! Welcome to the Ivan on Tech Shop!");
var input = prompt("How much bread do you want?");
sum += input * bread;
var input = prompt("How much milk (liter) do you want?");
sum += input * milk;
var input = prompt("How much cheese do you want");
sum += input * cheese;
var input = prompt("How many yogurts do you want?");
sum += input * yogurt;
console.log("The total amount to pay is", sum);
I was just wondering if anyone knew whether it is better practice to define a variable (in this case sum) as 0 and use it as a sort of running total, and if so, why?
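For example, I imagine the running total would make it easier to add more items later, maybe something like this (just my own sketch, the prices object is something I made up, not part of the exercise):
var prices = { bread: 2, milk: 1.5, cheese: 4, yogurt: 1.2 };
var sum = 0;
for (var item in prices) {
    // Ask for the quantity of each item and add its cost to the running total
    var qty = prompt("How much " + item + " do you want?");
    sum += qty * prices[item];
}
console.log("The total amount to pay is", sum);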