FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


Go Back   FRDB Archives > Archives > Religion (Closed) > Biblical Criticism & History
Old 03-11-2007, 08:56 PM   #181
Banned
 
Join Date: Jan 2007
Location: Canada
Posts: 528
Default

Quote:
Originally Posted by Julian View Post
[MOD]
Okay, this thread is degenerating into personal attacks and sniping. Almost everybody is guilty of this, including yours truly. Please return to discussing the topic or the thread will be closed.

Julian
Moderator BC&H
[/MOD]
First, hats off to you for admitting your negative contribution. I appreciate this.

Second, I agree. It would be great if the moderators started deleting cheap baiting messages that have no topical content, not just editing unsatisfactory responses.

Let's get back to John 8:1-11 and, if necessary, reprimand those who keep trying to derail the thread.

The current sub-topic is PCA (Principal Component Analysis) as applied to NT textual criticism, 8:1-11 in particular. But comments about and analyses of previously posted internal evidence are welcome.
Nazaroo is offline  
Old 03-11-2007, 09:20 PM   #182
Banned
 
Join Date: Apr 2005
Location: Queens, NY
Posts: 2,293
Default

Quote:
Originally Posted by Nazaroo
PCA is a 'fad' technique. Something that a lot of 'soft/social science' fields are turning to right now, to give their flaky garbage more 'scientific credibility'.... The problem is, most of the 'experts' in these fields (like those of textual criticism) are definitely not mathematicians, let alone statistical experts. (You have to be competent in statistical mathematics, by the way, to be a physicist.)....What happens is, shoddily designed 'experiments' are sprinkled with 'powerful techniques', and the result is crap....
Just a little aside. This is very similar to what happened with the "Jesus Family Tomb". Call in a Professor of Statistics as wrapping paper, yet the whole methodology is faulty and is handed to him as unalterable. Don't try to really get to the core issues. Use the techniques (PCA, statistical calculation) for the agenda. I grant that the PCA stuff is not of much interest to me personally.

Keep in mind that even a "statistical expert" may not really have the methodology and design function in hand. Once, a little younger, I was the rating statistician for the United States Chess Federation. And I knew diddly beyond crunching the numbers; Elo was the system designer.

It is a bit like the difference between a limited computer programmer and a good systems analyst. The programmer can be little more than a bean-counter, but designing the system properly takes some savvy and understanding: seeing the big picture.

That is where Tabor and Simcha failed, from ignorance or agenda. Their understanding of post-facto probability design was very, very limited, based on the presentation and the posts from Tabor since.

That is one place where the PCA analysis from Willker falls down: design. It gives us nothing in regard to the basic questions of authenticity. A lot of circularity and fluff. The bullseye was already drawn, ready to be moved to the arrows.

Shalom,
Steven Avery
Steven Avery is offline  
Old 03-12-2007, 12:23 AM   #183
Banned
 
Join Date: Apr 2005
Location: Queens, NY
Posts: 2,293
Default

Professor Maurice Robinson argues that the wide and overwhelming preponderance of Byzantine readings by the 11th century is essentially proof that this was the ancient text (this takes in factors like hand copying and geographical distribution; I think I am representing him reasonably).

Now that is in essence a type of vector & transmission analysis. (I'm not saying this is easy to prove, however he makes a lot of sense when you start contemplating the nature of transmission and how difficult it becomes for a variant to take over a diverse textline.)
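[Editor's note] The intuition above, that it is hard for a variant to take over a diverse and independently copied tradition, can be probed with a toy Monte Carlo sketch. This is an editorial illustration under simplified assumptions (neutral copying, random survival of exemplars), not Robinson's actual model; all parameter names and values are invented for the sketch.

```python
import random

def simulate_takeover(lines=50, generations=20, copies_per_gen=2, seed=1):
    """Toy copying model: each generation every surviving exemplar is
    copied copies_per_gen times, then random attrition trims the pool
    back to its original size. One line starts with a variant reading.
    Returns the variant's final share of the tradition."""
    random.seed(seed)
    pool = [1] + [0] * (lines - 1)   # 1 = variant reading, 0 = original
    for _ in range(generations):
        children = []
        for reading in pool:
            children.extend([reading] * copies_per_gen)
        pool = random.sample(children, lines)  # random survival
    return sum(pool) / len(pool)

# Across many runs the variant usually stays rare or dies out entirely;
# takeover of a diverse tradition is the rare exception.
runs = [simulate_takeover(seed=s) for s in range(200)]
print(sum(1 for f in runs if f > 0.5) / len(runs))
```

Under neutral drift the variant's expected share stays at its starting share (1 in 50 here), which is why dominance by a late reading is the surprising outcome the argument leans on.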

Now if there is a way to frame this evidence in a rigorous fashion, or make it more visible, that would be interesting. This applies to the Pericope too. Have you dealt with this question?

Shalom,
Steven
Steven Avery is offline  
Old 03-12-2007, 01:27 AM   #184
Veteran Member
 
Join Date: Jun 2004
Location: none
Posts: 9,879
Default

Quote:
Originally Posted by praxeus View Post
Professor Maurice Robinson has an assertion that the wide and overwhelming preponderance of Byzantine readings by the 11th century is essentially proof that this was the ancient text (this takes in factors like the hand copying and geographical distribution .. I think I am doing a reasonable representation).
Professor Robinson is also a committed Evangelical. Taking "Robinson said so" at face value is no easy matter.

Quote:
Now that is in essence a type of vector & transmission analysis. (I'm not saying this is easy to prove, however he makes a lot of sense when you start contemplating the nature of transmission and how difficult it becomes for a variant to take over a diverse textline.)
It's not that hard. Plato quotes a version of the Iliad that we don't have today. The Dead Sea Scrolls show a plethora of diverse texts.
Chris Weimer is offline  
Old 03-12-2007, 03:50 AM   #185
Banned
 
Join Date: Apr 2005
Location: Queens, NY
Posts: 2,293
Default

Quote:
Originally Posted by Chris Weimer
Professor Robinson also is a committed Evangelical. Taking "Robinson said so" is no easy matter.
I'm not sure of your point here. The question is whether his transmission understanding makes sense and is the strongest theory. The common view has lots of problems, which is why they invented the Lucianic recension theory, now generally discarded.

Quote:
Originally Posted by Chris Weimer
It's not that hard. Plato quotes a version of the Iliad that we don't have today. The Dead Sea Scrolls show a plethora of diverse texts.
A vector transmission analysis goes far beyond a copy or two. It works with the question of why there are hundreds of hand copies in agreement over a wide range.

Shalom,
Steven Avery
Steven Avery is offline  
Old 03-12-2007, 04:32 AM   #186
Veteran Member
 
Join Date: Feb 2004
Location: Washington, DC (formerly Denmark)
Posts: 3,789
Default

What Chris is talking about regarding M. Robinson is that, although he is very knowledgeable and does some excellent and useful work, very few people take his stance on the majority text seriously. By this I mean no disrespect towards him, since his contributions have been better than most. It is merely that evangelicals and apologists are frequently guilty of a tail-wagging-the-dog approach to religious problems and, when confronted with a critique of this, usually respond with a 'tu quoque' fallacy.

While some parts of the Byzantine text-type are obviously early/original, it simply cannot be placed early enough in the quantity that would be necessary to make it a viable theory. Only wishful thinking can do that. Besides, this is an issue only important to inerrantists and literalists, a group not to be taken seriously. Even the 'all the early manuscripts are from Egypt' critique fails to establish a Byzantine trajectory. This is a problem with all the religiously motivated attacks on current scientific understanding: the destruction of the established vector (even had this been accomplished) does nothing to establish a new one.

Saying that variants are hard to establish is just plain silly in light of the manuscript evidence of the Bible. Look at the Western text for starters.

Prax is correct in noting that Willker's PCA study does nothing to help our understanding of the pericope under discussion. PCA is obviously not a 'fad' technique (as if there could even be such a thing in mathematics), and eigenvectors and singular value decomposition and so on are all well understood, including by the scores of physicists that I know or talk to daily (every single one of whom uses their real names). All that Willker's study shows is that it is statistically possible to show the direction and magnitude of manuscript separation. That's all.
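[Editor's note] The "direction and magnitude of manuscript separation" point can be made concrete with a minimal PCA-via-SVD sketch. The matrix below is invented for illustration (it is not Willker's data): rows are hypothetical manuscripts, columns are variation units coded as competing readings.

```python
import numpy as np

# Toy manuscript-by-variant matrix (invented data, not Willker's):
# rows = manuscripts, columns = variation units, 1/0 = competing readings.
X = np.array([
    [1, 1, 1, 0, 0],   # ms1 }
    [1, 1, 0, 0, 0],   # ms2 } one textual cluster
    [1, 1, 1, 1, 0],   # ms3 }
    [0, 0, 0, 1, 1],   # ms4 } a second cluster
    [0, 0, 1, 1, 1],   # ms5 }
], dtype=float)

Xc = X - X.mean(axis=0)           # center each variation unit
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                    # principal-component scores per manuscript
explained = s**2 / np.sum(s**2)   # share of variance per component

# PC1 gives both direction (sign) and magnitude of separation:
# the two clusters land on opposite sides of zero.
print(scores[:, 0])
print(explained[0])
```

Note what the output does and does not say: PC1 separates the two groups and tells you by how much, but nothing in the arithmetic privileges either cluster as "better" or "earlier"; that value judgment has to come from outside the technique.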

I have done extensive statistical analysis on the Greek text of the NT and on this pericope in particular. So far it has all been inconclusive. I will have one more go at it, which I think will tilt the likelihood one way or another but will almost certainly not approach anything like 'certainty.' The new approach will involve syntactical and semantic stylometrics, which historically give better numbers but are not generally available.

Julian
Julian is offline  
Old 03-12-2007, 05:06 AM   #187
Banned
 
Join Date: Jan 2007
Location: Canada
Posts: 528
Default

Quote:
"To transform a table of numbers into PCA 'maps' we first create a model of the data, in a multi-dimensional Phase (Object) Space. Then we use a Normalization technique to size and shape it, a Transform method to orient it, and a Projection method to flatten it onto an X-Y plane.

A PCA 'shadow projection' can tell us a lot about a situation, even though the information has been greatly simplified or reduced. But we should always keep in mind that severe distortion and information loss are inevitably involved in PCA methods.

This is a 'lossy' process, meaning that some information is completely lost in going from data table to picture. This results in both missing cues in the picture, and artificial mirage-like artifacts or distortions. The result can be severely misleading, and any 'discovery' appearing in a projection must be independently verified using other techniques."

- Nazaroo
...From my upcoming online article, "Why PCA is Bullshit".
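[Editor's note] The "lossy projection" caveat in the quote is easy to demonstrate with synthetic data: two points clearly separated in the full space can land almost on top of each other in the 2-D PCA shadow. This is an editorial sketch on invented 4-D data, not manuscript data.

```python
import numpy as np

# Synthetic 4-D cloud whose variance lives almost entirely in the first
# two dimensions, so a 2-D PCA projection keeps axes 0-1 and discards 2-3.
rng = np.random.default_rng(0)
cloud = rng.normal(size=(200, 4)) * np.array([5.0, 4.0, 0.3, 0.3])

mean = cloud.mean(axis=0)
U, s, Vt = np.linalg.svd(cloud - mean, full_matrices=False)

def project(v):
    """Flatten a 4-D point onto the first two principal axes."""
    return (v - mean) @ Vt[:2].T

# p and q differ only in the discarded dimensions...
p = np.array([0.0, 0.0,  3.0,  3.0])
q = np.array([0.0, 0.0, -3.0, -3.0])
full_dist = np.linalg.norm(p - q)                    # clearly separated in 4-D
flat_dist = np.linalg.norm(project(p) - project(q))  # nearly superimposed in 2-D

# ...so the projection hides a real separation: a lost cue of exactly the
# kind the quote says needs independent verification.
print(full_dist, flat_dist)
```

The design choice here is deliberate: because the discarded axes carry almost no variance of the cloud, PCA is mathematically entitled to ignore them, yet that is precisely where the difference between p and q lives.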


Quote:
Originally Posted by Julian
I have done extensive statistical analysis on the Greek text of the NT and on this pericope in particular. So far it has all been inconclusive. I will have one more go at it which I think will tilt the likelihood one way or another but will almost certainly not approach anything like 'certainty.'
If your results are inconclusive, it's because you don't know what you are doing, where you are going, or why.

Your experimental design is obviously hopelessly unfocussed and fatally flawed.

It's not rocket science to establish probable knowledge or probable authorship in a trivial case like this. We do it all the time in fingerprint ID'ing and email analysis, face and voice recognition, and license-plate recording.

You haven't even addressed the extensive pattern-matching evidence I posted regarding the internal evidence from John's Gospel, the chiastic structures, and symmetry-breaking as a result of the Aleph/B omission. You haven't explained how John could have embedded extensive 'tamper-retardant' features into his Gospel, all of which are damaged but not removed when the PA is removed, and yet not be the one who included it.

How is copious evidence of John's awareness of the passage 'inconclusive'? ...

Keep in mind that all of your blather about a "new approach [that] will involve syntactical and semantic stylometrics" will be meaningless if John simply used an earlier tradition or source document for this passage, and included it because he liked it.

There is abundant evidence elsewhere in John that he used sources and probable eye-witness accounts from other people (e.g. the woman at the well, Nicodemus (twice, counting ch. 7), Lazarus and family, etc.).

So far, all previous attempts to separate the obviously disparate material using 'stylometrics' have utterly failed, not so much because John heavily stylized everything in his own hand, but because (what is now almost universally recognized) John HAD NO STYLE. Even his Semiticisms and Judaean dialectical flavour are elusive to quantify.

All of this frustration, and the hard evidence in the form of null results from a hundred years of other researchers, should have cued you that you're barking up the wrong tree.


The key to John is not in semantic or grammatical structures per se (micro-level analysis) but in its incredibly heavy structural content (MACRO-level analysis).

Nazaroo is offline  
Old 03-12-2007, 05:51 AM   #188
Banned
 
Join Date: Apr 2005
Location: Queens, NY
Posts: 2,293
Default

Quote:
Originally Posted by Julian
While some parts of the byzantine text-type are obviously early/original
Would you give any examples of those parts? Do you consider those to be only the places where they are in sync with one or both of the major Alexandrian manuscripts?

Quote:
Originally Posted by Julian
it simply cannot be placed early enough in the quantity that would be necessary to make it a viable theory. Only wishful thinking can do that.
And Professor Maurice Robinson is one modern expert who gives the theoretical base that challenges this view. The proponents of your view are not particularly known for cogent responses and productive dialog on the issues raised. Gordon Fee offered a "hard case" challenge that Professor Robinson answered most excellently.

Quote:
Originally Posted by Julian
Besides this is an issue only important to inerrantists and literalists, a group not to be taken seriously.
Why the truth of the matter would depend on who is offering or accepting the alternative theories is a puzzle. Perhaps your idea is that only those who are looking for an errant text (as formulated in their a priori postulates) are relevant. Then you are expressing a philosophical-religious bias.

Quote:
Originally Posted by Julian
Even the 'all the early manuscripts are from Egypt' critique fails to establish a byzantine trajectory.
The simple fact of a couple of Alexandrian manuscripts of course will not "establish a byzantine trajectory" by itself. What would establish the trajectory is analyzing what could produce the agreement among the hundreds of diverse hand-copied manuscripts over a wide geographical area. Which is precisely why I referenced the theories of Professor Robinson and asked if they had been utilized in any modern models.

Quote:
Originally Posted by Julian
This is a problem with all the religiously motivated attacks on current scientific understanding: the destruction of the established vector (even had this been accomplished) does nothing to establish a new one.
That might be a phrase of significance if two manuscripts made a vector. In fact the modern version text was essentially a proof-text methodology using two manuscripts, in great disagreement with one another and scribally very corrupt (Sinaiticus horribly so).

Quote:
Originally Posted by Julian
Saying that variants are hard to establish is just plain silly in light of the manuscript evidence of the bible. Look at the western text for starters.
You are mixing apples and oranges. There are a small number of manuscripts involved (ones that may have had a special Latin influence), unlike the many hundreds in the diverse Byzantine line. Why not tell the forum the precise number of extant manuscripts involved and identify them, including language?

Quote:
Originally Posted by Julian
Prax is correct in noting that Willker's PCA study does nothing to help our understanding of the pericope under discussion.
Well that agreement, finally, is good to hear.

Quote:
Originally Posted by Julian
All that Willker's study shows is that it is statistically possible to show the direction and magnitude of manuscript separation. That's all.
Right. Unfortunately he taints his own efforts by adding a nonsensical value component, clearly considering those groups closer to the "reconstructed autograph" as superior. Then in future analysis those superior manuscripts can be given more weight than those that are inferior. This is simply an expansion of the modern textcrit game that begins with Aleph and B: first declare them "neutral" and the "earliest and most reliable," and then judge other manuscripts by their fealty to the two corrupt textual darlings.

Shalom,
Steven Avery
Steven Avery is offline  
Old 03-12-2007, 05:57 AM   #189
Banned
 
Join Date: Jan 2007
Location: Canada
Posts: 528
Default

Quote:
Originally Posted by Julian View Post
...regarding M. Robinson ..., although he is very knowledgable and does some excellent and useful work, very few people take his stance on the majority text seriously. By this I mean no disrespect towards him since his contributions have been better than most. It is merely that evangelicals and apologists are frequently guilty of a tail-wagging-the-dog approach to religious problems and when confronted with a critique of this usually respond with a 'tu quoque' fallacy.
It is deeply significant when well-known experts like yourself evaluate other notable and respected scholars in the field.

Your negative evaluation of Robinson is profound and clearly sends up a red flag to all non-experts and outsiders who would mistakenly take Robinson's credentials into account and give his naive and flawed methodology unwarranted credence.

Luckily you follow up with helpful guidance for novices immediately:


Quote:

While some parts of the byzantine text-type are obviously early/original it simply cannot be placed early enough in the quantity that would be necessary to make it a viable theory. Only wishful thinking can do that.

Besides this is an issue only important to inerrantists and literalists, a group not to be taken seriously. Even the 'all the early manuscripts are from Egypt' critique fails to establish a byzantine trajectory. This is a problem with all the religiously motivated attacks on current scientific understanding: the destruction of the established vector (even had this been accomplished) does nothing to establish a new one.
Emotive and pejorative language has been highlighted.

Thank goodness you have properly exposed Maurice Robinson as a religious quack who somehow adopted the camouflage of an academic with a 30-year career, slipping through the cracks.

Your expert opinion has been greatly enhanced around here by these insights.

[/IRONY]

Quote:
PCA is obviously not a 'fad' technique (as if there could even be such a thing in mathematics), and eigenvectors and singular value decomposition and so on are all well understood, including by the scores of physicists that I know or talk to daily (every single one of whom uses their real names).
PCA is very much a fad technique in the soft sciences and social sciences. In many cases it is properly applied with good results, through consultation with statistical experts and experiment designers; however, this is not always the case.

When techniques that are 'old' in a field like mathematical analysis are applied for the first time in new fields, often 'over-applied' due to the inexperience and enthusiasm of the person bringing them in, we can fairly talk about 'fad' techniques and applications.

Not in mathematics, which was NOT what I said, but in, for instance, the social sciences, where PCA began to be applied in the 1990s.

PCA in various forms was not 'invented' by Pearson in the 90s, but was around even in Newton's time.


PCA Projection of a modern Textual Critic:

[image not preserved in the archive]

Carefully note the Eigenvalues.
Nazaroo is offline  
Old 03-12-2007, 07:26 PM   #190
Veteran Member
 
Join Date: Feb 2004
Location: Washington, DC (formerly Denmark)
Posts: 3,789
Default

Quote:
Originally Posted by Nazaroo View Post
...From my upcoming online article, "Why PCA is Bullshit".
Ah yes, a title that is highly representative of the kind of title a real physicist would choose to lend credibility to their work.
Quote:
If your results are inconclusive, its because you don't know what you are doing, where you are going, or why.

Your experimental design is obviously hopelessly unfocussed and fatally flawed.
Really? That's an interesting observation considering that you know nothing about my design or what I do, or do not, know. But then again, you are the physicist, you know far more than I.
Quote:
Its not rocket-science to establish probable knowledge or probable authorship in a trivial case like this. We do it all the time in fingerprint I.D.ing and Email analysis, face and voice recognition, and license-plate recording.
Except, of course, that fingerprints are unique and the rest carry a level of uncertainty. Well, maybe not that last one, since I have no idea what 'license-plate recording' might be. As for establishing authorship, well, if it is so trivial then how come you haven't already done so? I mean, why are scholars still debating this if you can trivially provide evidence that can establish probable authorship? Or is 'probable' your way of saying 'uncertain'?

By the way, I know quite a few professors in Computational Linguistics who have devoted much of their lives to determining authorship of texts. Even with long texts where authorship is known, they have a hard time coming up with great numbers. For smaller sections they are hard pressed to do better than 50-50. Maybe you, as a physicist, can show them how ignorant they are and how trivial it is? How about you start by lecturing these professor-type guys on how little they know about Greek authorship determination: Automatic Authorship Attribution. By the way, their approach is very similar to what I am proposing, with a few twists. I am hopeful you will point out how futile this design is and point me (and NLP experts) in the right direction.
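[Editor's note] The function-word frequency profiling that underlies much authorship-attribution work can be sketched minimally. This is an editorial toy, not the method of the linked paper: English function words stand in for the Greek particles (kai, de, gar, oun, ...) a real study would count, and the distance measure is a crude stand-in for a Burrows-style delta.

```python
from collections import Counter

# Hypothetical function-word list; real Greek stylometry would use Greek
# particles. English stand-ins keep the sketch short.
FUNCTION_WORDS = ["the", "and", "of", "to", "in", "that", "is", "for"]

def profile(text):
    """Relative frequency of each function word in a text."""
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def distance(a, b):
    """Mean absolute difference of two profiles (crude delta measure)."""
    pa, pb = profile(a), profile(b)
    return sum(abs(x - y) for x, y in zip(pa, pb)) / len(FUNCTION_WORDS)

# Short texts give unstable profiles, which is the 50-50 problem above:
# with only a few hundred words, sampling noise swamps authorial signal.
sample_a = "the word and the light and the life of the world"
sample_b = "in that day is the hour for the son of man"
print(distance(sample_a, sample_b))
```

On samples this short the distance says almost nothing about authorship, which is exactly why a dozen verses like the Pericope are such a hard target for stylometrics.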
Quote:
You haven't even addressed the extensive pattern-matching evidence I posted regarding the internal evidence from John's Gospel, the chiastic structures, and symmetry-breaking as a result of the Aleph/B omission. You haven't explained how John could have embedded extensive 'tamper-retardant' features into his Gospel, all of which are damaged but not removed when the PA is removed, and yet not be the one who included it.
I have yet to see a convincing demonstration of pervasive chiasm usage in the gospels. And, if it is not pervasive, it becomes statistically insignificant. Of course, you knew that.

Also, I think too much importance is attached to Sinaiticus and Vaticanus. I still think they are top-notch manuscripts, however. An assessment I am sure you find ignorant.

Of course, it would be tempting to point out that anyone seeking to add a forged passage to GJohn here might look at the text in the general vicinity and, having it in front of him and fresh in his mind, seek to emulate and duplicate the effort. One might suggest that too much similarity is not a good thing either.

And then there is Henry J. Cadbury's "A Possible Case of Lukan Authorship (John 7:53-8:11)," HTR Vol. 10, No. 3 (Jul. 1917), pp. 237-244. I am sure you can explain why his views are wrong and you are correct.
Quote:
How is copious evidence of John's awareness of the passage 'inconclusive'.? ...
I guess people's definition of 'copious' varies. For the tail-wagging-the-dog-type it would seem that 'copious' = 'agreement.' But then again, I am no physicist and deeply ignorant.
Quote:
Keep in mind that all of your blather about a "new approach [that] will involve syntactical and semantic stylometrics" will be meaningless if John simply used an earlier tradition or source document for this passage, and included because he liked it.
Good thing that you have already shown that this is not the case. My method may help influence the perception of authorship, not whether or not the author of John liked to cut and paste his material.
Quote:
There is abundant evidence elsewhere in John that he used sources and probable eye-witness accounts from other people (e.g. woman at well, Nicodemus (twice, counting ch 7), Lazarus' & family etc.)
Hmmm, 'abundant.' Is that like 'copious?'
Quote:
So far, all previous attempts to separate the obviously disparate material using 'stylometrics' have utterly failed, not so much because John heavily stylized everything in his own hand, but because (what is now almost universally recognized) John HAD NO STYLE.
I don't know... That hat he wore on Sundays was kinda cool!

Of course, you do realize that no style is a style. Mathematically, at least. But you knew that.
Quote:
Even his Semiticisms and Judaean dialectical flavour are elusive to quantify.
But earlier you say it would be easy to recognize his style. You said, "Its not rocket-science to establish probable knowledge or probable authorship in a trivial case like this." Yet now you say his style may sometimes be elusive? Are you teasing us before you unleash your trivial evidence?
Quote:
All of this frustration and hard evidence in the form of null results by other researchers should have cued you 100 years ago that you're barking up the wrong tree.
Please explain 'null results' in this context. In statistics, a definition of, and a subsequent measurement in accordance with, or divergent from, a null hypothesis is a most excellent result. Are you saying that Student's t is bad?
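[Editor's note] For readers unfamiliar with the terminology: a "null result" here means a test statistic too small to reject the null hypothesis. A minimal sketch of a two-sample t statistic (Welch's variant, which drops the equal-variance assumption; the sample numbers are invented):

```python
import statistics

def welch_t(xs, ys):
    """Welch's two-sample t statistic: mean difference divided by the
    combined standard error of the two samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    vx, vy = statistics.variance(xs), statistics.variance(ys)
    return (mx - my) / ((vx / len(xs) + vy / len(ys)) ** 0.5)

# Two hypothetical word-rate samples; a t near 0 is a 'null result' in the
# informative sense: the data fail to separate the two groups.
a = [4.1, 3.9, 4.3, 4.0, 4.2]
b = [4.0, 4.2, 3.8, 4.1, 4.1]
print(welch_t(a, b))
```

The point being argued in the thread is that such a result is not a failure of the method: failing to find a stylistic difference is itself evidence, provided the test had power to find one.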
Quote:
The key to John is not in semantic or grammatical structures per se (micro-level analysis) but in its incredibly heavy structural content (MACRO-level analysis).
And granularity and phrase resolution make just no-never-mind to you? It's kinda like those Hollywood movies where a camera picks up a single pixel of a face but they can magically zoom in, enhance, and retrieve a highly detailed likeness. So, dazzle us, already!

Julian

P.S. BTW, I would love to know, and you can certainly refuse to answer, where you acquired your PhD in physics?
Julian is offline  
 

This custom BB emulates vBulletin® Version 3.8.2
Copyright ©2000 - 2015, Jelsoft Enterprises Ltd.