
On the Percent Error of a 0.0 Value


Procrastination


Okay, so this may be impossible, wrong, or ridiculous, I don't really know, but my homework says I have to find the percent error of certain values, and one of them is 0.0. The value in question is the first one, and it is measured in volts. The smallest division on the voltmeter's scale is 0.1. I know that to find the percent error of something, I take the absolute error, which in this case would be 0.1, divide it by the actual value, which would be 0.0, and multiply by 100. Now, I don't get it: how am I supposed to divide by 0.0? Blame my teacher, this is wrong, right? Or is it possible to find the percent error here with another method?
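To make the snag concrete, here is a minimal Python sketch of the percent-error formula described above (the function name and the 2.0 V example reading are mine, purely for illustration); it fails in exactly the way the post describes when the measured value is 0:

```python
def percent_error(absolute_error, value):
    """Percent error = (absolute error / actual value) * 100."""
    if value == 0:
        # Division by zero: the formula has no finite result here.
        raise ZeroDivisionError("percent error is undefined for a 0.0 value")
    return abs(absolute_error / value) * 100

# A normal reading works fine: 0.1 V absolute error on a 2.0 V measurement
print(percent_error(0.1, 2.0))   # about 5.0 percent

# The homework case, 0.1 V error on a 0.0 V reading, raises instead of
# returning a number:
# percent_error(0.1, 0.0)  -> ZeroDivisionError
```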



Hmmm, that is strange! Are you sure it's not the other way around? If not, have a look at this: http://wiki.answers.com/Q/How_do_you_calculate_the_mean_absolute_percent_error_with_a_0_actual_value

I think it's telling you to do 0.1 / 0.01, i.e. with 0.01 being the value closest to zero "with the number of decimals appropriate to the accuracy of the measurement."

Edit: This might be a better idea - http://answers.yahoo.com/question/index?qid=20080220192525AAJcwgA
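The workaround in those links, substituting a small nonzero value for the true 0.0, might be sketched like this; the default substitute of 0.01 just follows Keel's example above and is an assumption, not a standard choice:

```python
def percent_error_near_zero(absolute_error, value, substitute=0.01):
    """Percent error that falls back to a small nonzero substitute
    when the actual value is 0, as the linked answers suggest."""
    denominator = value if value != 0 else substitute
    return abs(absolute_error / denominator) * 100

# Keel's numbers: 0.1 / 0.01 * 100 comes out to about 1000 percent
print(percent_error_near_zero(0.1, 0.0))
```

The enormous result (on the order of 1000 %) is one reason many people prefer to report the error as undefined rather than substitute.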



I know, right?! I'm like :blink: . Anyway, that certainly helped a little. However, if I take into account "with the number of decimals appropriate to the accuracy of the measurement," the reading only has one decimal, something like 0.1, so I can't just use 0.01 because that would go against the previous statement. I think I'll just write "undefined." :blink: (My teacher is crazy.)
