|
Post by jonbain on Jul 17, 2021 12:28:14 GMT
The search for an objective foundation of the mind seems to be a never-ending Sisyphean nightmare. So we need to identify those aspects of mind that are undeniable (except to the habitual contrarian). Kant identifies the analytic and synthetic faculties. But where is emotion here? Creativity? It seems clear that in common discourse we separate IQ from EQ readily, and that logic and emotion are thus two halves of the mind. Can we then equate emotion with synthetic thought and call it creativity, in opposition to logic?

Plenty of folk deride emotions and creativity as weaknesses to be avoided, priding themselves on their 'logical nature'. And thus they find themselves in a permanent nightmare of wars. War is illogical, they might want to believe, but it is the culmination of logic alone: dominate or die. Then they weep for peace when defeated.

So we return to psychology, and the most fundamental structures stripped of verbosity and endless texts. Jung: the five quintessential faculties of mind epitomized by the ancient Greek gods: Anger, Love, Sadness, Joy and Enlightenment (the five visible planets: Ares, Venus, Saturn, Jupiter and Mercury). Or Freud: Id, Ego, Superego. Or we can turn to the most ancient source, the Vedics, and their excellent system of chakras: Wisdom, Intellect, Communication, Emotion, Geometry, Vitality, Health.

Five, three or seven. Either way, we need to see these modes of awareness in a numerical format, a division of being, in order to describe them with words. But let us also see that Freud's triad sits atop a more intrinsic dichotomy: the observation of conscious and unconscious. The line between consciousness and unconsciousness is the famous subconscious, being the point of becoming conscious. So is that now nine faculties for Freud (three faculties across three levels of awareness)? Can you find a more useful way to describe mind than any of these? Remember: to be foundational it needs to be reducible to the fewest core components.
But these need to deliver functionality in comprehending mind. It is hardly meaningful to study philosophy of mind without immersing oneself in the quagmire of psychology. And yet any mind that has not suffered the rigour of mathematics will flail about in the maelstrom of verbosity. And yet so much that passes for math is sophistry in the extreme, without any ability or attempt to be rooted in the empirical world. The unforgiving world of computer algorithms will soon show up who can think in terms of genuine logic, and who follows its forms in style and applies a random synthesis of labels which have the outside appearance, but lack all substance. This is why it is ironic that I have to conclude: if you cannot construct something like an evolutionary solar system, examining for yourself the categorical difference between Newtonian physics and Einstein's woeful attempt; if you cannot do something like that, then you will continue to flounder in wordiness and a clutching of straws by straw-men. ... and where psychology ends, there begins: computer programming.
|
|
|
Post by Eugene 2.0 on Jul 17, 2021 14:34:40 GMT
"The unforgiving world of computer algorithms will soon show up who can think in terms of genuine logic, and who follows its forms in style and applies random synthesis of labels which have the outside appearance, but lack all substance".
That's for certain.
Also, a comment on Jung. I haven't read him, except for his analysis of the Book of Job, but I have to say I didn't like that analysis. Why? Because I considered it too contrived, or not morally complete. By "morally complete" I mean showing some respect to the ancient authors. And I don't think that the Book of Job was the work of the collective unconscious. We can't say that everything is made up by spirit; there are also material things, and that's why I look at Jungian psychology with a certain wariness.
I think that the Kantian program was too formalized, and that's why his analysis of sentences was too dry to retain anything from psychology (I mean his "Critique of Pure Reason" primarily). Some mathematical proofs can be translated into a less formal and more phenomenological form, like Euler's polyhedron proof that V - E + F = 2, but that doesn't mean we cannot make mistakes there. And I guess if we want to be sure of something, it is the same as in war: all is fair in war. I guess that the common understanding of what dialectic is comes close to this principle.
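As an aside, Euler's polyhedron formula V - E + F = 2 is easy to check mechanically. Below is a minimal Python sketch; the vertex/edge/face counts are the standard ones for the Platonic solids, and the function name is my own:

```python
# Check Euler's polyhedron formula V - E + F = 2 for the Platonic solids.
# The counts below are the standard ones (e.g. a cube has 8 vertices,
# 12 edges, 6 faces).

def euler_characteristic(vertices, edges, faces):
    """Return V - E + F."""
    return vertices - edges + faces

solids = {
    "tetrahedron": (4, 6, 4),
    "cube": (8, 12, 6),
    "octahedron": (6, 12, 8),
    "dodecahedron": (20, 30, 12),
    "icosahedron": (12, 30, 20),
}

for name, (v, e, f) in solids.items():
    print(f"{name}: V - E + F = {euler_characteristic(v, e, f)}")  # 2 each time
```

Of course, checking instances is not a proof; the phenomenological proof Eugene mentions proceeds by deforming and simplifying the polyhedron's graph.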
|
|
|
Post by jonbain on Jul 17, 2021 16:14:15 GMT
I agree about Kant. I did a ridiculously in-depth study of the Critique of Pure Reason in phil201; it lacked that emotional aspect, and that is why his attempt to define fundamental categories was only half-full.
|
|
|
Post by jonbain on Jul 17, 2021 16:34:07 GMT
I have not yet read Jung on Job, and I am unlikely to do so, as I reckon I completely understand Job already.
Job starts out by making a pointless sacrifice of an animal "just in case" one of his children sins in the future.
At the end of the book, God states that Job's suffering was for "no reason".
Just as the sacrifice was for no reason. God punishes Job for his arrogance in killing an animal for no reason.
So it is both "for no reason" and also for the most divine reason: karma.
Is it fair to kill an entire family for one animal? Yes. It's proportional. The animal sacrifice was infinitely worse than what went before it, as nothing provoked it.
Job's family may be a million times more valuable than the animal. But the animal has infinitely more worth than nothing.
It's a very deep lesson on fear and paranoia escalating into total disaster. Much like what is going on in the world today. "Be safe", we are told. But in the same way, paranoia has destroyed half the global economy.
The "coroner's virus" was only ever the common cold, plus people's fear and ignorance. The quackery and blatant pseudo-science of virology, as well as malicious neo-Darwinian ideology, killed many. But the lies and statistics were only ever about profit, never about health. It's most likely not going to end soon either, as ignorance is on almost every person's lips.
|
|
|
Post by Eugene 2.0 on Jul 17, 2021 16:54:51 GMT
Sorry for responding briefly (my devices overheat in this indeed warm summer, and I have no air conditioning).
a) To tell the truth, Kant did put some psychology into his analysis, but apparently only incidentally. One example of his use of psychology is discussed in G. Frege's "The Foundations of Arithmetic" (in the first chapters). Frege explained that Kant wrongly took math to be a synthetic discipline, because his proof was based on a psychological argument that had been aimed at another philosopher (I don't remember his name, but Kant mentioned him in his introduction, where he discusses the analytic-synthetic problem).
b) My own interest in this problem of using psychology in logic is tied up with these questions: 1) how do we formalize something without appealing to psychology? And if we do appeal to it, why then all those dots, fuzzy semantics, etc.? 2) the question of proof, i.e. heuristic methods. When I'm trying to decide whether ~~(A v ~A) is derivable in minimal logic (which is close to constructive or intuitionistic logic), I'm not sure about using the previous theorem A → (A v B), instantiated as A → (A v ~A).
So, as long as such things occur, we never know what to do. Unfortunately I'm not that good with computer languages, and I have no real skills in coding/programming. I used to do it, but only with mentors and older helpers; I mean I haven't really learned anything up to now. (That's why I didn't completely understand your answer about the acceleration, etc. However, the answer was good.)
c) I see that you've done many things in philosophy, but to my shame I haven't read much about it, so I'd ask you to share some links to your works in philosophy, of course, if you would like to.
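For the classical side of the question, a brute-force truth table confirms that both A v ~A and ~~(A v ~A) are classical tautologies. A truth table says nothing about derivability in minimal or intuitionistic logic, which is exactly the distinction at issue; the function names below are my own illustration:

```python
from itertools import product

def is_classical_tautology(formula, variables):
    """Brute-force truth-table check (classical semantics only --
    it says nothing about derivability in minimal or intuitionistic logic)."""
    return all(formula(*values)
               for values in product([True, False], repeat=len(variables)))

# A v ~A : the law of the excluded middle
lem = lambda a: a or not a
# ~~(A v ~A) : its double negation
dn_lem = lambda a: not (not (a or not a))

print(is_classical_tautology(lem, "A"))     # True
print(is_classical_tautology(dn_lem, "A"))  # True
```

Classically the two formulas are equivalent; the interesting fact (Glivenko-style) is that the double-negated form is intuitionistically derivable while the plain form is not.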
|
|
|
Post by jonbain on Jul 17, 2021 19:07:42 GMT
Eugene 2.0 The best source for philosophy is to get a "Dictionary of Philosophy". I forget who wrote mine. But they tend to cut out all the verbosity so you can get a good summary, and thus know what to take further without having to slog your way through endless sophistry. This also got me into trouble, because when I pointed out that Confucius had been saying mostly everything Plato had been saying, but that he said it 50 years earlier, they decided I was a traitor. I do not understand all that formulaic stuff at all, sorry.
|
|
|
Post by joustos on Jul 18, 2021 15:44:52 GMT
I agree that where philosophy of mind ends, there begins (modern) psychology. At the same time, you could have quoted Aristotle in line with the "psychologists", especially his distinction between "nous poietikos" (Agent Intellect) and Passive Intellect. Anyway, all the psychologists you cite had some insight into the human mind, whether they posited an "objective foundation" or not. I will add a Medieval psychology (especially St. Bonaventure's): the intellective soul [above Aristotle's sentient/perceptive soul] or Mind is tripartite, an image of the Divine Trinity: creative/causing, understanding or possessing the ideas/concepts of things [the Logos, Reason], and emotive/affective/loving. {Historically this is the reverse of what the early Christian theologians had done, namely understanding God in the image of the human mind. -- A sculptor or artisan is always the model of the nature of the human mind.} // I don't understand the proposition that where the psychology of mind ends, there begins computer programming.
|
|
|
Post by jonbain on Jul 18, 2021 15:56:19 GMT
Interestingly, I equate Freud's Superego, Ego, Id with Father, Son, Holy Ghost. Which is also Zeus, Poseidon, Hades.
Though Aristotle was a plagiarist at best, still the ideas he stole could be worth something.
The term 'passive intellect' reminds me of habits, rather than the decisions to make or break habits.
Though this could be equated with Freud's unconscious, and that can be traced all the way back to Plato's cave...
As for "where psychology ends, there begins computer programming": I could have said, once your emotions are in equilibrium, the task before you will be mathematical.
But "math" that does not live inside an evolutionary algorithm is, these days, typically sophistry in almost all cases.
|
|
|
Post by Eugene 2.0 on Jul 18, 2021 16:23:40 GMT
Oh, sorry for that. I'll explain. Ordinary propositional logic has some standard theorems. It can prove plain formulas, yet it does not allow proof from absurdity or by contradiction (the proof-by-contradiction method is one of the most common in math). Minimal logic is an attempt to add just a few new rules to make proof from absurdity possible. But instead of a great expansion, in minimal logic we can just erase the negation sign from the last formula, i.e. the conclusion; and if there is some negation in the conclusion, the proof has to end in a contradiction. Along with this, there are some limits in minimal logic, and one of them is not to derive the semantic laws of logic which are not plain (otherwise there would be no need for minimal logic at all; instead of carefully adding new rules, we would make it neither straight nor consistent). "A v ~A" is the law of the excluded middle, which for common and minimal logic can break the semantics in some cases. A → (A v B) allows deriving "A or B" from any A: if A is true, then "A or B" is also true. A → (A v ~A) is almost the same, but with a little confusion: by putting "~A" into this formula we seem to gain the ability to prove "A v ~A" in minimal logic, while this is not allowed there. The reason we cannot do this is that it would give a more powerful conclusion than we are entitled to.

So, formalizing it in such a way makes our system of minimal logic more perplexing. The most annoying question is how to avoid using such formulas as A → (A v ~A) in proofs. There are some formulas in minimal logic that have to be proven, and during the process of proof such formulas occur. On one hand, such lemmas can be maintained strictly as lemmas; on the other, we have to add them to our theorems. Neither path seems good.
It's sad to hear that there are people who, only because of a chronological fact, call other people "traitors". They shouldn't have done it. I think it should be clear by now that before Aristotle, some schools in Ancient India (and also in Ancient China), such as Nyaya, knew logical forms and used them. I read a book about Indian philosophy and logic by a Moscow professor named Vladimir Shochin (https://vphil.jes.su/index.php?dispatch=authors.details&author_id=954&sl=en), and he gave some examples of the Nyaya school's (and other schools') usage of logic long, long ago. It had a slightly different structure, but all the main principles. Even further, the Indian schools of those times had wider conceptions of emptiness and nothingness.
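The semantic side of why "A v ~A" is not valid intuitionistically (and hence not derivable in minimal logic, which is weaker still) can be shown with a two-world Kripke countermodel. The Python encoding below (world names, forcing functions) is my own illustrative sketch:

```python
# Two-world Kripke countermodel for the excluded middle.
# Worlds w0 <= w1; the atom A is forced only at the later world w1,
# modelling "A might become true later".

reachable = {"w0": ["w0", "w1"], "w1": ["w1"]}  # reflexive order w0 <= w1
atom_A = {"w0": False, "w1": True}

def forces_A(w):
    return atom_A[w]

def forces_not_A(w):
    # A world forces ~A iff no reachable world forces A.
    return all(not forces_A(v) for v in reachable[w])

def forces_A_or_not_A(w):
    return forces_A(w) or forces_not_A(w)

print(forces_A_or_not_A("w0"))  # False: excluded middle fails at w0
print(forces_A_or_not_A("w1"))  # True
```

At w0, A is not yet true, but ~A also fails (A holds at the reachable w1), so the disjunction is not forced; this is the standard reason constructive logics refuse A v ~A as a theorem.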
|
|
|
Post by jonbain on Jul 18, 2021 17:37:09 GMT
Oh, they were just ostracizing me for not being racist. They were trying to make a pretext for genocide. They are busy doing this now in the name of the "coroner's virus", which is little more than a human culling program. A direct violation of the Nuremberg Code.
As for formal logic: I really do understand logic. They also tried to convince me that replacing words with such symbols somehow makes it clearer, when it does not. We may as well replace one language with another, or change the font. It does not alter the logic at all, but instead gives them a sense of aloofness. My apologies, but I have long since rejected that notation as clumsy and for the most part unreadable.
Having learnt some half dozen computer languages, I really consider it superfluous to change English into such symbols. This is why I keep pointing out that the next step beyond philosophy is computer programming. Moreover, language is not always like math: a double negation does not produce a positive in many situations. Also, words have multiple meanings, whereas proper math is unambiguous.
|
|
|
Post by Eugene 2.0 on Jul 18, 2021 19:12:35 GMT
Anyway, I think they should worry about losing you, because if they're racists, nothing can help them. Racism is a plague, maybe one connected to some deep dissatisfaction. In middle school, a black guy came to my school.
He said he was Arab, while some of his ancestors were from Africa (I don't know from where). (Before him, a black girl had also studied there. Now it's more usual, but in Eastern Europe it's rare to have mixed classes as in France or the US.) His name was Artyom. It happened that I was his only friend. I spent two years being friends with him. He was a true reader; moreover, he beat our school's record among top readers. Perhaps in my life I never met a reader like him. Some of the older boys mocked and teased him. Many of them tried to fight him. I wouldn't say Artyom was a good fighter; he wore glasses, and his glasses were often broken. At the same time, those older boys were afraid of him, and mostly envied him. But I guess those bullies had many complexes (in the psychological sense). And before middle school I had a friend from Dagestan. So I guess tolerance was in my veins since childhood :)
Speaking of logic, etc.: yes, without any doubts I accept your remarks and agree with them. Why did I need minimal logic and the other stuff? It's simple: it's like a mind trainer, some thinking exercises. Unfortunately I learned about logic too late; I had been looking for it without knowing of its existence. If I had been a more logical thinker earlier, it would have changed my life. I didn't wish to tie my life up with philosophy when I was younger; I wanted to deal with electronics. In primary school I took my grandfather's books (mainly physics) and tried to read them. I would run (at breaks) to the physics and technology teachers of the higher classes to ask them about the working of a capacitor or a transistor... You may think I'm crazy, but back then, when I was a boy, I had no idea that math was what would help me in physics.
That's why later (not having good grades in algebra, while having good ones in geometry) I turned to chemistry, and later dropped any wish to continue with physics. So that was a real drama for me: I wanted to tie my life to electronics, and I ruined my dream. If I had read some other books about logic (Boolean algebra, discrete math, etc.) earlier, I wouldn't have been so confused reading about electronics, because the latter has much more to do with mathematical logic than with math itself (especially for beginners). Even now, reading another book on AI or the like and seeing logic there, I feel much more comfortable than if I hadn't heard about logic. Even physics ideas come to me more clearly when they're explicit (in the logical sense). Let's say Kepler's laws as usually formulated aren't clear to me, while their logical explanation is clearer. The same goes for many things, including the theory of evolution, the working of a computer, and so on.
|
|
|
Post by jonbain on Jul 18, 2021 20:07:12 GMT
Eugene 2.0 One thing that has become apparent in solving the 3D n-body gravity problem is that the number of articles out there that are senseless sophistry is just staggering. Scary. 99% of what has earned a PhD in astrophysics over the last century is complete junk. Then one begins to detect the signs of such junk theory on a psychological/intuitive level. Like when you ask someone to prove a claim and they just cite more claims without any proofs. The endless big words, the sophistry, the psychological side-stepping. How they ignore the logic you have just proved, and bring in some other irrelevant detail that sounds like it might have something of value. But does not.

Then I apply this as inductive logic to other areas, like biology, evolution, archeology, and psychology, and I am shocked at how bad our academic systems really are. It literally makes me shed real tears. Often. In a nutshell: if they cannot solve the very simple Newtonian gravity formula with just 9 bodies in 3 dimensions of space, then how can they claim to comprehend life itself? By 'they', I mean the Nobel committee. This is why I keep saying: if one does not completely understand my thesis on 3D n-body gravity, then one does not comprehend science as a method at all.

But this is really a problem of ethics. Given my premise that science has been eroding for at least a century, it is of the highest ethical imperative to be able to see just how bad it is by properly building an evolutionary solar system. After all, it was Newton, Galileo and Descartes who defined the very method of science. And because that has almost entirely been eroded, a disaster of global proportions is not only inevitable, but has been eroding the very moral fabric of society like a hellfire for the last year. All virology is worse than astrology. The current fiasco is little more than a witch-hunt where trial-by-water has simply been replaced by the "PCR" test. We are living in holocaust 2.0.
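For what it's worth, numerically integrating the Newtonian n-body equations is a few dozen lines. Below is a minimal, illustrative Python sketch (G = 1 units, a fixed time step, a kick-drift-kick leapfrog, and a two-body circular-orbit test case; all numbers are my own illustrative choices, and this is not jonbain's "evolutionary solar system"):

```python
import math

# Minimal Newtonian n-body integrator (leapfrog / kick-drift-kick).
# A sketch only: G = 1 units, fixed time step, no close-encounter handling.
G = 1.0

def accelerations(positions, masses):
    n = len(masses)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r = math.sqrt(sum(d * d for d in dx))
            for k in range(3):
                acc[i][k] += G * masses[j] * dx[k] / r**3
    return acc

def leapfrog(positions, velocities, masses, dt, steps):
    acc = accelerations(positions, masses)
    for _ in range(steps):
        for i in range(len(masses)):            # half kick
            for k in range(3):
                velocities[i][k] += 0.5 * dt * acc[i][k]
        for i in range(len(masses)):            # drift
            for k in range(3):
                positions[i][k] += dt * velocities[i][k]
        acc = accelerations(positions, masses)
        for i in range(len(masses)):            # half kick
            for k in range(3):
                velocities[i][k] += 0.5 * dt * acc[i][k]
    return positions, velocities

# Illustrative test: a heavy central body plus a light satellite
# on a circular orbit (v = sqrt(G*M/r) = 1 at r = 1).
masses = [1.0, 1e-6]
pos = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
vel = [[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
pos, vel = leapfrog(pos, vel, masses, dt=0.001, steps=1000)
radius = math.sqrt(sum(c * c for c in pos[1]))
print(round(radius, 3))  # stays close to 1.0 for a circular orbit
```

The same loop runs unchanged for 9 bodies; what makes the real problem hard is not the formula but the chaotic sensitivity to initial conditions and the accumulation of integration error over long timescales.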
|
|
|
Post by thesageofmainstreet on Jul 19, 2021 18:09:36 GMT
From the Frying Pan Into the Fire

The elitist tyranny's mind-control also creates and controls its own opposition, so beware of being forced into a rebellion that intentionally goes off-track. For example, a truly independent take on the crony virus is that auto emissions had killed off all viruses since they stopped the Spanish Influenza, so the Lockdown's reduction of them was solely responsible for this Green Holocaust. None of the appointed pundits in the weak, compromised, and misdirected opposition have come up with that explanation. Ironically, "thinking out of the box" exactly expresses the fraud. It comes from and imitates a puzzle that starts within the box, briefly goes outside it but too close by to be anything meaningful, then goes back into the Establishment's box canyon.
|
|
|
Post by jonbain on Jul 19, 2021 18:50:39 GMT
thesageofmainstreet First point is spot on. The only way around it is to remain an individual; not form a mob. The "coroner's virus" is the mistaking of correlation for cause. Same as ALL virology. They assume viruses operate like bacteria. Not so. All claims to eradicate viruses are false; the disease is simply renamed as a new one. Classic example: polio = motor neuron disease. Viruses are the remnants of dead cells. They are the consequence of disease. The real cause can be many things, but mostly, if your diet is correct, you will have a strong immune system and you will not "catch the common cold". Other causes are toxins in the environment, and prions. The last is likely the real issue here. Prions are typically a consequence of human flesh mixed into your burgers.
|
|
|
Post by thesageofmainstreet on Jul 20, 2021 15:19:44 GMT
When Doctors Become Dictators, That's Really Sick
Another media lie is calling this a "novel" virus. It's been around for thousands of years. Long ago, the Chinese (only 4,636 deaths this time) became immune to it the hard way. The Mexicans (practically pure-blooded American Indians) left China before it started (Peking Man), so they are susceptible. Whites have had sufficient dealings with China for a thousand years to make them practically immune. So the American deaths are merely what used to be called "died of old age" back when doctors were honest and not part of a greedy clique that sought greater funding.
|
|