Procrastination — Posted August 29, 2011

Okay, so this may be impossible, wrong, or ridiculous, I don't really know, but my homework says I have to find the percentage error of certain values, and one of them is 0.0. The value I'm talking about is the first one, and it is measured in volts. The minimum scale of the voltmeter is 0.1. I know that to find the percentage error of something I take the absolute error, which in this case would be 0.1, divide it by the actual value, which would be 0.0, and multiply by 100. But I don't get it: how am I supposed to divide by 0.0? Blame my teacher, this is wrong, right? Or is it possible to find the percentage error some other way?
Keel — Posted August 29, 2011

Hmmm, that is strange! Are you sure it's not the other way around? If not, have a look at this: http://wiki.answers.com/Q/How_do_you_calculate_the_mean_absolute_percent_error_with_a_0_actual_value

I think it's telling you to compute 0.1 / 0.01, i.e. with 0.01 being the value closest to zero "with the number of decimals appropriate to the accuracy of the measurement."

Edit: This might be a better idea - http://answers.yahoo.com/question/index?qid=20080220192525AAJcwgA
Procrastination (author) — Posted August 29, 2011

I know, right?! Anyway, that certainly helped a little. However, if I take into account "with the number of decimals appropriate to the accuracy of the measurement", it would be only one decimal, something like 0.1, so I can't just use 0.01 because that would go against the previous statement. I think I'll just write "undefined". (My teacher is crazy.)
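The calculation the thread keeps circling around can be written down in a few lines. This is a minimal sketch, not from the thread itself; the helper name `percent_error` is made up for illustration. It follows the OP's own conclusion: when the measured value is zero, the percentage error is simply undefined, so the function returns `None` instead of dividing by zero.

```python
def percent_error(measured, absolute_error):
    """Percentage error = (absolute error / measured value) * 100.

    Returns None when the measured value is zero, because the
    ratio is undefined there -- the divide-by-zero problem the
    thread runs into.
    """
    if measured == 0:
        return None  # undefined, as the OP concludes
    return abs(absolute_error / measured) * 100


print(percent_error(2.0, 0.5))  # 25.0
print(percent_error(0.0, 0.1))  # None: a 0.0 V reading with a 0.1 V scale
```

Returning `None` (rather than substituting a stand-in value like 0.01) matches the "write undefined" answer: the substitution trick from the linked pages changes the question rather than answering it.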