Scientific research seeks to draw inferences about population parameters from data. The usual approach is to gather data and use it to test and investigate a particular theory. The theory is often formulated in terms of parameters that are unknown and must be estimated from the sample data. Of course, scientists will have some prior belief about these parameters, and the Bayesian view of statistics provides a paradigm they can use to update those beliefs and establish their latest beliefs about the parameters of a model. If we let theta represent the parameters of interest, we can imagine these parameters as the settings on a machine that produces x values (the data). That is, x is produced under the influence of theta, which is symbolized as x|theta (x given theta). This is what happens in theory; in practice we do not know the theta values, we only see their effects in x. So what we really need to do is reverse the direction and infer the values of theta from the data x -- that is, theta|x (theta given x).
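A tiny simulation may make the "machine" picture concrete. This is only an illustrative sketch: the choice of a coin-flip machine, the particular theta value, and the sample size are all hypothetical, chosen for simplicity.

```python
import random

# Hypothetical illustration: theta is the setting of the "machine" --
# here, the probability that a coin lands heads -- and x is the data
# the machine produces.
random.seed(42)
theta = 0.7  # in real research we never observe this value directly

# Draw x | theta: ten coin flips produced under the influence of theta.
x = [1 if random.random() < theta else 0 for _ in range(10)]
print(x)  # the analyst sees only x, and must infer theta from it
```

The analyst's task runs in the opposite direction from the simulation: given only the list x, reason back to a belief about theta.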
The interesting thing about this paradigm is that it explicitly allows prior beliefs to be included in the analysis. This is important because our past experiences may contain information that is applicable to our understanding of a parameter; however, that understanding needs to be updated. The posterior is the updated prior and represents our latest belief.
One thing we don't want is to hold to a prior belief structure that would close down the possibility of new and more accurate information about the Lord's instructions to us in this age. Bayes' rule is expressed mathematically by the formula

p(theta|x) = f(x|theta) p(theta) / p(x)

The left side is the posterior, i.e. the latest update. The right side shows how this is calculated: the prior p(theta) encodes the investigator's prior beliefs; the likelihood f(x|theta) provides the means by which the information in the data enters the update process; and the denominator p(x), the evidence, forms the proportionality constant that ensures the posterior is correctly scaled. From a Bayesian point of view the likelihood and the prior are the two ingredients needed to form the posterior. We must therefore examine both carefully in the light of a Bayesian model that functions to update our understanding of interesting parameters that affect all of us today.
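A minimal numerical sketch of such an update, assuming a simple coin-flip model not taken from the text above: theta is the probability of heads, the prior on theta is a Beta(a, b) distribution, and the likelihood of the observed flips is binomial. With this conjugate pairing the posterior is again a Beta distribution, so the update reduces to arithmetic on the two Beta parameters.

```python
# Prior belief: Beta(2, 2), i.e. theta is probably near 0.5 but uncertain.
a_prior, b_prior = 2, 2
data = [1, 1, 0, 1, 1, 0, 1, 1]  # hypothetical observed flips (1 = heads)

heads = sum(data)
tails = len(data) - heads

# Posterior = prior updated by the likelihood of the data:
# Beta(a + heads, b + tails).
a_post = a_prior + heads
b_post = b_prior + tails

prior_mean = a_prior / (a_prior + b_prior)  # 0.5
post_mean = a_post / (a_post + b_post)      # shifts toward the data
print(prior_mean, post_mean)
```

Running this shows the belief about theta moving from the prior mean of 0.5 toward the proportion of heads in the data, exactly the prior-to-posterior movement the formula describes.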
A Biblical Example of a Bayesian Update
Bayesian ideas are found in our arguments and logical interactions across a multitude of genres, and it is really no surprise that they are also found in the Bible. In the 18th chapter of Acts the writer introduces us to Apollos, a Jew who was mighty in the scriptures. The relevant passage begins in verse 24:
24 And a certain Jew named Apollos, born at Alexandria, an eloquent man, and mighty in the scriptures, came to Ephesus.
25 This man was instructed in the way of the Lord; and being fervent in the spirit, he spake and taught diligently the things of the Lord, knowing only the baptism of John.
You can clearly see the prior that Apollos had: while instructed in the way of the Lord, mighty in the scriptures, and eloquent, he knew only the baptism of John. His prior p(theta) went only as far as John's baptism.
26 And he began to speak boldly in the synagogue: whom when Aquila and Priscilla had heard, they took him unto them, and expounded unto him the way of God more perfectly.
Aquila and Priscilla gave him an economical update so that he became acquainted with the way of God more perfectly. Now consider how much greater an update we who have had the "mystery" revealed to us can perform on those who have ears to hear.
27 And when he was disposed to pass into Achaia, the brethren wrote, exhorting the disciples to receive him: who, when he was come, helped them much which had believed through grace:
28 For he mightily convinced the Jews, and that publickly, shewing by the scriptures that Jesus was Christ. (Act 18:24-28 KJV)
After receiving this update, Apollos functioned with a new state of understanding, which we call the posterior p(theta|x), and with this understanding he "helped them much" and "mightily convinced the Jews."