Nazism - A Syncretic Ideology with Left/Right Agnosia:

Left/Right Agnosia is a neurological symptom, often associated with Gerstmann syndrome (amongst other such disorders), in which the subject is unable to distinguish between left and right. National Socialism, or Nazism as it is otherwise commonly known, has long been claimed, usually by scholars and academics with exclusively Left of centre political inclinations, to be a “Right Wing” or a “Far-Right” ideology.

I would respectfully but strongly dissent from this rather simplistic and somewhat blinkered view of their ideological framework, in favour of formulating a deeper and more thoughtful assessment of exactly who the National Socialists were, what they stood for and against, and how they might fit into the somewhat archaic, and often completely arbitrary, Left/Right political spectrum.

I would firstly contend that, of all the political ideologies, the simplistic Left/Right dichotomy applies least to National Socialism. It is first and foremost a revolutionary ideology (something more often associated with Left wing doctrines), and it is neither reactionary nor evolutionary in nature, qualities more usually associated with being to the Right of the political centre.

Nazism is an anti-religious doctrine (unlike most other Right wing ideologies), and it is also fervently anti-capitalist, seeking to establish an idealised and novel variation of human society in which the individual is entirely subservient to the state. National Socialism (“Nazism”) was therefore better described as the “Third Way”, as its proponents regularly claimed in their own words, and in their written manifestos of the time.

In view of this, it would be more accurate to refer to Nazism as a syncretic ideology, an amalgamation of political thought embracing elements of both the political Right and the Left. Nazism clearly embraced certain aspects of the contemporary Leftist thinking of its time, including such ideas and concepts as eugenics, Social Darwinism, universal healthcare, Malthusianism, organic farming and macrobiotics, and radical environmentalism, as well as elements we might now associate with “the Right”: ultranationalism, racial purity and militarism in particular. It could well be argued, however, that many nominally “Leftist” regimes (e.g. North Korea, Communist China, Stalinist Russia) have embraced any or all of these allegedly “Right Wing” characteristics at various times.

Hitler’s Political Origins Give Clues to His Ideological Leanings: 

What is particularly interesting about Adolf Hitler is that he did not arise from the political Far Right. He did not, for example, join the Freikorps (the “Free Corps”), bands of demobbed soldiers who, after the First World War, soon proceeded to mobilise around the country crushing any “socialist” insurrections, shooting Bolshevist Jews and so on. In spite of Hitler’s subsequent reputation, he never took part in any of that far Right activity at all.

Hitler was instead attracted to the far Left revolutionary government in Munich founded by Kurt Eisner, and was soon elected as a representative of his regiment to a council of that same “Soviet” government. The idea of workers, soldiers and peasants electing councils, then called “Soviets”, originally came from Vladimir Lenin and the Russian Bolsheviks. Lenin’s early battle cry was ‘All power to the Soviets’.1

In post-War Germany, after the fall of the Kaiser, Germany’s monarch and emperor, the Marxists and their fellow Left wingers were inspired by the original idea of Lenin’s ‘Soviets’, and worker and soldier councils soon sprouted up as Leftist regimes suddenly emerged in cities all around Germany, taking full advantage of the chaos of the times.

Adolf Hitler then took over a very small socialist party in Munich, called the German Workers’ Party, and renamed it the NSDAP, the National Socialist German Workers’ Party. He was able to do so because he had a strong and vocal following that invaded the party’s meetings and voted him in as secretary.

Hitler then produced his 25-point programme, a manifesto that is remarkably socialist and Left leaning in many respects. The main thing distinguishing it from Marxist socialism is that his programme had no internationalist idealism. Hitler cared nothing for other nations and instead wanted his brand of socialism to support the German worker exclusively.

From the NSDAP’s “unalterable” 25-point programme of 1920, the party proposed, among other things:

“that all unearned income, and all income that does not arise from work, be abolished”;

“the nationalization of all trusts”;

“profit-sharing in large industries”; and

“an agrarian reform in accordance with our national requirements, and the enactment of a law to expropriate the owners without compensation of any land needed for the common purpose. The abolition of ground rents, and the prohibition of all speculation in land.”

Notice also that in their 25-point programme the Nazis wished to replace Roman Law with Germanic Law.

Here, Hitler began to show an even more overt “Left Wing” ideological bent, in that he wanted to abolish the leisure class and, as stated above, all unearned income. People would have only the money they could earn with their own work, not that of others. The whole rentier class, and usury of any kind, was therefore in his gun sights. He also demanded higher pensions for the elderly, and promoted the idea that the greater good of the community of the German people should be a primary goal. Hitler placed a very strong emphasis in his ideology on the “German blood” community (Volksgemeinschaft), and on returning all German lands to the Reich, including his native Austria.

Even at this early stage, there is a strong disapproval of foreigners in his proposals. He believed that there should be no work for foreigners in Germany while there was still unemployment amongst German citizens. No foreign-owned newspapers would be allowed that could try to pull the wool over German eyes or defend the interests of foreign powers. The only foreign papers to be sold would be those not written in German, for tourists or for those studying those foreign lands.

Hitler had by then begun to exhibit more freely a marked streak of anti-Semitism, believing that the Jews were not really Germans, but were instead foreigners. While it is tempting to label this as ideologically “Right Wing”, the same trends of anti-Semitism and nationalist preference were strongly embedded in the early French socialists of the 19th century.

In their own words: “Since we are Socialists, we must necessarily also be antisemites because we want to fight against the very opposite: materialism and mammonism… How can you not be an antisemite, being a socialist!”

In the mind of the Nazi, Jews were synonymous with money changing, usury and materialism. So, in addition to not being “true” Germans in Nazi eyes, they were also despised by Hitler in particular for not being truly patriotic, allegedly not helping sufficiently to support or finance Germany’s WW1 war effort2, and for being the prime representatives (due largely to their aptitude for success in business) of the capitalist ethos that the Nazis strove to eliminate.

From Wikipedia:

Before Austria became a republic, the NSDAP (Nationalsozialistische Deutsche Arbeiterpartei) proclaimed this programme in May 1918:

“…the German National Socialist Workers’ Party is not a party exclusively for labourers; it stands for the interests of every decent and honest enterprise. It is a liberal (freiheitlich) and strictly folkic party fighting against all reactionary efforts, clerical, feudal and capitalistic privileges…”

Prof. J. Salwyn Schapiro’s Definition of Fascism: 

“It would be a great error to regard fascism as a counter-revolutionary movement directed against the Communists, as was that of the reactionaries against the liberals during the first half of the nineteenth century. Fascism is something unique in modern history, in that it is a revolutionary movement of the middle class directed, on the one hand, against the great banks and Big Business and, on the other hand, against the revolutionary demands of the working class. It repudiates Democracy as a political system in which the bankers, capitalists, and socialists find free scope for their activities, and it favours a dictatorship that will eliminate these elements from the life of the nation. Fascism proclaims a body of doctrines that are not entirely new; there are no “revelations” in history.” 

Konrad Heiden: (“Hitler: A Biography”)

“He (Adolf Hitler) wants, once and forever, to do away with the old ruling caste; with petrified legitimists, and hollow dignitaries in gold-braided uniforms” - therefore Hitler was not himself a reactionary, as those “petrified legitimists and hollow dignitaries” are the very reactionaries he despises.

“Socialism is an ancient Aryan, Germanic institution. Our German ancestors held certain lands in common. They cultivated the idea of the common weal. Marxism has no right to disguise itself as socialism. Socialism, unlike Marxism, does not repudiate private property. Unlike Marxism, it involves no negation of personality, and unlike Marxism, it is patriotic.”

“We might have called ourselves the Liberal Party. We chose to call ourselves the National Socialists. We are not internationalists. Our socialism is national. We demand the fulfilment of the just claims of the productive classes by the state on the basis of race solidarity. To us state and race are one.”

The Revolutionary Zeal of the Nazis:

Adolf Hitler himself wanted to call his party “The Social Revolutionary Party”.

In “Der Fuehrer”, Konrad Heiden, the first biographer of Hitler and the National Socialist movement writes: 

“Röhm coined the slogan that there must be a ‘second revolution’, this time, not against the Left, but against the Right; in his diary, Goebbels agreed with him. On April 18, he maintained that this second revolution was being discussed ‘everywhere among the people’; in reality, he said, this only meant that the first one was not yet ended. ‘Now we shall soon have to settle with the reaction. The revolution must nowhere call a halt’.”

He goes on to describe Nazism as “A youth creating for itself a new state. A new species of man.”

and: “Hitler seized on it in his own way. He led the uprooted proletarians and the uprooted intellectuals together. And this gives rise to a new man: Neither of the two could exist without the other. Both belong together, and from these two a new man must crystallize - the man of the coming German Reich.”

Hermann Rauschning was a German conservative reactionary who briefly joined the Nazi movement before breaking with it. In 1934, he renounced Nazi Party membership and in 1936 emigrated from Germany. He eventually settled in the United States and began openly denouncing Nazism. 

Rauschning is chiefly known for his book “Gespräche mit Hitler” (“Conversations with Hitler”, American title: “Voice of Destruction”, British title: “Hitler Speaks”) in which he claimed to have had many meetings and conversations with Adolf Hitler:

“National Socialism is an unquestionably genuine revolutionary movement in the sense of a final achievement on a vaster scale of the ‘mass rising’ dreamed of by Anarchists and Communists.”

It can be seen from the evidence above that Hitler and the Nazis were genuinely revolutionary, not the least bit reactionary in their outlook, nor in any way conservative by their very nature.

The Symbology of Nazism:

The National Socialist symbology makes the case for at least some Left Wing characteristics even more strongly. The Nazi flag displays an outer red border/background symbolising universal socialism, and within it a white circle symbolising the German nation (and likely also racial purity), containing the swastika emblem, which symbolised the Aryan heritage linking the German peoples back to their mythical roots in the fall of ancient Troy. The swastika symbol itself was also, interestingly, used as a symbol by Soviet Red Army units in the post-revolution Russian civil war, prior to its adoption by Adolf Hitler and his Nazi cohorts.

Interestingly, the Nazi flag was actually designed by Adolf Hitler himself, a salient fact that gives significant insight into his perception of what precisely Nazism was, and of what it represented politically and ideologically to him, through the symbolism he used to define it.

Paul von Hindenburg’s Fatal Error of Judgement:

The elections of September 1930 resulted in the break-up of the Grand Coalition (which included all of the major parties of the left, centre, and centre-right – the SPD, the Catholic Centre Party, the German Democratic Party (DDP), and the German People’s Party (DVP)), and its subsequent replacement with a minority cabinet. Its leader, Chancellor Heinrich Brüning of the Centre Party, governed through emergency decrees from President Paul von Hindenburg. Governance by decree became the new norm and paved the way for more authoritarian forms of government. The Nazi Party rose from obscurity to win 18.3 per cent of the vote and 107 parliamentary seats in the 1930 election, becoming the second-largest party in parliament as a consequence.

Hindenburg met Adolf Hitler for the first time in October 1931, at a high-level conference in Berlin. Everyone present saw that they took an immediate dislike to each other. Afterwards Paul von Hindenburg in private often disparagingly referred to Hitler as “that Austrian corporal”, “that Bohemian corporal” or sometimes simply as “the corporal”, and also derided Hitler’s Austrian dialect.

For his part, Hitler often labeled Hindenburg as “that old fool” or “that old reactionary“. On 26 January 1933, Hindenburg privately told a group of his friends: “Gentlemen, I hope you will not hold me capable of appointing this Austrian corporal to be Reich Chancellor”. Hindenburg made it clear that he saw himself as the leader of the “national” forces and expected Hitler to follow his lead.

In the 1932 German election, Hitler ran against Hindenburg. Hindenburg had support from various nationalist, monarchist, Catholic, and republican parties, and some Social Democrats. Hitler came in second in both rounds of the election, garnering more than 35 per cent of the vote in the final election. Although he lost to Hindenburg, this election established Adolf Hitler as a strong, and increasingly ominous force in German politics.

Hindenburg’s conservative supporters had decided not to make any personal attacks against Adolf Hitler in the first round of the 1932 elections, and instead criticised the NSDAP and its ideology.

In the runoff, they portrayed Hitler as a party man whose anti-republican rhetoric disguised the NSDAP’s adherence to the system. They also portrayed Hitler and the Nazis as socialists whose rhetoric against Marxism was a disguise for their own dislike of private property and free enterprise. They contrasted Hindenburg’s Christian character with Hitler’s apathy at best, and downright antipathy at worst, towards organised religion.

Source: Jones, Larry Eugene (1997). “Hindenburg and the Conservative Dilemma in the 1932 Presidential Elections”. German Studies Review. 20 (2): 235–259. doi:10.2307/1431947.

The absence of an effective government prompted two influential politicians, Franz von Papen and Alfred Hugenberg, along with several other industrialists and businessmen, to write a letter to Hindenburg. The signatories urged Hindenburg to appoint Hitler as leader of a government “independent from parliamentary parties”, which could turn into a movement that would “enrapture millions of people”.

Hindenburg reluctantly agreed to appoint Hitler as Chancellor after two further parliamentary elections—in July and November 1932—had not resulted in the formation of a majority government. Hitler headed a short-lived coalition government formed by the Nazi Party (which had the most seats in the Reichstag) and Hugenberg’s party, the German National People’s Party (DNVP).

Source: Wikipedia

Various strategies were attempted by the Nazi Party’s opponents, conservatives included, to prevent it from forming a majority government. Because of the political stalemate, Hitler then asked Paul von Hindenburg to again dissolve the Reichstag, and elections were scheduled for early March. On 27 February 1933, fate intervened when the Reichstag building was set on fire in an act of terrorism. For many decades after the War, historians believed this to have been a false flag event at the instigation of the Nazis themselves, but in recent times it has become generally accepted that the fire was set by a lone Dutch Communist (Marinus van der Lubbe), who committed the act in protest at the rise of Fascism in Germany.

In what can only be described as one of the worst “own goals” in political history, the incident became a key component of the Nazi regime’s ascendancy to an absolute dictatorship. In response, Hindenburg signed the Reichstag Fire Decree of 28 February, drafted by the Nazis, which suspended basic rights and allowed detention without trial. The decree was permitted under Article 48 of the Weimar Constitution, which gave the President (Hindenburg) the power to take emergency measures to protect public safety and order. Activities of the German Communist Party (KPD) were suppressed, and some 4,000 KPD members were then arrested under this pretext.

On election day, 6 March 1933, the Nazi Party’s share of the vote increased to 43.9 per cent, and the party acquired the largest number of seats in parliament. Hitler’s party failed to secure an absolute majority, necessitating another coalition with the DNVP.

To achieve full political control despite not having an absolute majority in parliament, Hitler’s government brought the Ermächtigungsgesetz (Enabling Act) to a vote in the newly elected Reichstag. The Act – officially titled the Gesetz zur Behebung der Not von Volk und Reich (“Law to Remedy the Distress of People and Reich“) – gave Hitler’s cabinet the power to enact laws without the consent of the Reichstag for four years. These laws could (with certain exceptions) deviate from the constitution.

Since it would affect the constitution, the Enabling Act required a two-thirds majority to pass. Leaving nothing to chance, the Nazis used the provisions of the Reichstag Fire Decree to arrest all 81 Communist deputies (in spite of their virulent campaign against the party, the Nazis had allowed the KPD to contest the election) and also to prevent several Social Democrats from attending.

After Hitler verbally promised Centre Party leader Ludwig Kaas that Hindenburg would retain his power of veto, Kaas announced the Centre Party would support the Enabling Act. The Act passed by a vote of 444–94, with all parties except the Social Democrats voting in favour. The Enabling Act, along with the Reichstag Fire Decree, transformed Hitler’s government into a de facto legal dictatorship.

Source: Wikipedia

The “coalition” between conservative parties and the Nazis was a vain attempt by an ageing, ailing and probably increasingly demented Paul von Hindenburg to neutralise Hitler’s rise, after powerful supporters deserted him following his narrow victory in the 1932 election. Hindenburg was naive in the extreme in thinking that he could contain Hitler, who seduced “the old fool”, along with the conservative Hugenberg and Centre Party leader Kaas, by promoting a “people’s community” (Volksgemeinschaft) to break down elitism and unite people across class divides in pursuit of a national purpose.

Those actions and statements above do not make a compelling case for a voluntary coalition between the NSDAP and the various conservative parties; instead, one can see it as a naive attempt by those conservative parties to control the looming Nazi threat. Paul von Hindenburg had no idea who he was climbing into bed with, and probably thought his status as a WW1 hero would allow him to marshal popular support and remain in the political driver’s seat, even as his age and memory were clearly failing him. It was a fatal error of judgement, and a gross underestimation of the adversary he and his fellow conservatives had sought to negate. Hindenburg may have been a brilliant battlefield tactician, but he was completely outflanked and outmanoeuvred by the Austrian corporal and his Nazi cohorts.

Having effectively neutralised “the Right” through an ailing 84-year-old Paul von Hindenburg, the Nazis set their sights on their only likely opposition: the Communists and other Left leaning socialists. That the Nazis shared some (but not all) of their ideological underpinnings was largely irrelevant to the targeting of this group, which related instead to the level of threat it was perceived to pose to the acquisition of absolute power - a task largely enabled by the actions of one lone Communist in setting fire to the Reichstag at such a crucial time in German history.

Having achieved full control over the legislative and executive branches of government, Hitler and his allies began to suppress the remaining opposition. The Social Democratic Party was banned and its assets seized. While many trade union delegates were in Berlin for May Day activities, SA stormtroopers occupied union offices around the country. On 2 May 1933, all trade unions were forced to dissolve and their leaders were arrested. Some were sent to concentration camps. The German Labour Front was formed as an umbrella organisation to represent all workers, administrators, and company owners, thus reflecting the concept of Nazism in the spirit of Hitler’s Volksgemeinschaft (“people’s community”).

On 2 August 1934, Paul von Hindenburg died. The previous day, the cabinet had enacted the “Law Concerning the Highest State Office of the Reich”. This law stated that upon Hindenburg’s death, the office of President would be abolished and its powers merged with those of the Chancellor.

Adolf Hitler thus became head of state as well as head of government, and was formally named as Führer und Reichskanzler (leader and Chancellor), although Reichskanzler was eventually quietly dropped. With this action, Hitler eliminated the last legal remedy by which he could be removed from office.

Source: Shirer, William L. (1960). The Rise and Fall of the Third Reich. New York: Simon & Schuster. ISBN 978-0-671-62420-0.

Hitler’s Relationship with Big Business:

Based upon writings from contemporary sources, the early links between big business and Adolf Hitler’s National Socialists were sporadic at best.

The early growth of the NSDAP took place without any significant aid from the circles of large-scale enterprise. It was only after the Nazis’ electoral breakthrough in 1930, itself achieved without any meaningful help from Big Business interests, that the party began to garner significant support from the captains of industry.

Most contributions that were made were a form of political “hedging”, rather than a sign of explicit support for Nazi policies. Furthermore, the few sizeable contributions that do appear to have reached the Nazis from Big Business shrink in significance when compared to the amounts that went to the opponents of Nazism.

In light of Nazism sweeping through German society like an elemental force without Big Business support, industrialists clearly did not play a significant role in Hitler’s rise to power. 

As a knowledgeable and perceptive observer, Joseph Schumpeter, commented: “The attitudes of capitalist groups toward the policy of their nations are predominantly adaptive rather than causative, today more than ever.” Rather than shaping events, even the mightiest businessmen merely responded to events shaped by others.

Nazism nominally preserves private ownership of the means of production, and keeps the appearance of ordinary markets, prices, wages, and interest rates.

The nominal owners are, however, no longer entrepreneurs, but mere shop managers (Betriebsführer in the terminology of the Nazi legislation). These shop managers are instrumental in the conduct of the enterprises entrusted to them; they buy and sell, hire and discharge workers and remunerate their services, contract debts and pay interest and amortization.

In all their activities they are bound to obey unconditionally the orders issued by the Nazi government’s supreme office of production management. This office (the Reichswirtschaftsministerium in Nazi Germany) tells the shop managers what and how to produce, at what prices and from whom to buy, at what prices and to whom to sell. It assigns every worker to his job and fixes his wages. It also decrees to whom and on what terms the capitalists must entrust their funds. Any market exchange therefore is merely a sham. 

This was hardly an example of free market capitalism in any way, shape or form; in its stead a gradual “Bolshevisation” of industry was established behind a facade of autonomy, with the Nazi party controlling every facet of these industries.

Inevitably, the vast majority of any profits made their way directly or indirectly into Nazi coffers for the war effort, and had they won the War, all of these industries and businesses would eventually have been subsumed into the Party apparatus, which would control every aspect of their activities for the state. The natural inclinations of the Nazis were entirely geared toward assuming total government control over the means of production.

The NSDAP brand of socialism envisioned the control of the means of production, distribution and exchange, but not necessarily the ownership. Dividends were generally limited to 6% of profits; the rest had to be reinvested.

They effectively controlled production through the control of raw materials and labour, and exchange through a variety of methods. Outright ownership of companies did not begin until 1936 and the creation of the Reichswerke Hermann Göring, which became the largest industrial combine in the world and was 100% state owned. The SS itself also had vast industrial holdings under its full control.

“To put it quite clearly: we have an economic programme. Point No. 13 in that programme demands the nationalisation of all public companies, in other words socialisation, or what is known here as socialism. … the basic principle of my Party’s economic programme should be made perfectly clear and that is the principle of authority… the good of the community takes priority over that of the individual. But the State should retain control; every owner should feel himself to be an agent of the State; it is his duty not to misuse his possessions to the detriment of the State or the interests of his fellow countrymen. That is the overriding point.”

“Socialism as the final concept of duty, the ethical duty of work, not just for oneself but also for one’s fellow man’s sake, and above all the principle: Common good before own good, a struggle against all parasitism and especially against easy and unearned income. And we were aware that in this fight we can rely on no one but our own people.”

The Night of the Long Knives Was Not Ideologically Motivated:

The Night of the Long Knives was a purge that took place from 30 June to 2 July 1934. Then Chancellor Adolf Hitler, urged on by Hermann Göring and Heinrich Himmler, ordered a series of political extrajudicial executions intended to consolidate his power, and to alleviate the concerns of the German military about the role of Ernst Röhm and the Sturmabteilung (SA), the Nazis’ paramilitary organisation, known colloquially as “Brownshirts”. Nazi propaganda at the time presented the murders as a preventive measure against an alleged imminent coup by the SA under Röhm - the so-called Röhm Putsch.

The majority of the killings were carried out by the Schutzstaffel (SS) paramilitary force, under the direction of Himmler, and by its Sicherheitsdienst (SD) Security Service, and Gestapo (secret police) under Reinhard Heydrich. Hermann Göring’s personal police battalion also took part in the killings.

Amongst the main victims of this purge, aside from Röhm, were leading members of the leftist-leaning Strasserist faction, including its leader Gregor Strasser, leading some to speculate that this purge of Leftist elements was indicative of an ideological divide within the Nazi Party, and that it indicated Hitler’s more Right of centre inclinations.

This assertion completely ignores the fact that amongst the large number murdered by Hitler and his henchmen were many establishment Right wing conservatives, such as former Chancellor Kurt von Schleicher and Bavarian politician Gustav Ritter von Kahr, who had helped suppress Hitler’s Munich Beer Hall Putsch in 1923. The killing of Strasser, the most Left leaning of the Nazis, has since come to be misrepresented as ideologically motivated, when it would be much more accurate to describe the purge as a clean sweep of all of Hitler’s foes, including those on the political Right and conventional conservatives.

Hitler was by this stage purely interested in the consolidation of power, and in removing any potential or perceived threats - in this instance by “liquidating” anyone who stood between his close-knit faction and total control. As the Italian Mafia would be wont to say in such circumstances: “It’s not personal, just business!” The claim that Strasser’s death was ideological, in the context of the legion of other victims from various factions and of differing ideologies who were murdered in this bloody purge, is completely unsustainable.

The Difference Between Italian Fascism and its German National Socialist Brethren:

The main difference between National Socialism and Italian Fascism is in the type of Nationalist thought that is prevalent in each ideology.

Fascism, in its original conception, was not a racist, or racialist ideology. It is based on cultural Nationalism, and to a lesser extent civic Nationalism—it is about ideas. When Benito Mussolini came to power in Italy, anyone who was an Italian citizen could be a Fascist. In the 1920s and early 1930s, you only had to believe in Italian culture, and in the Fascist ideology. He didn’t care about your racial and/or ethnic heritage, or your religious beliefs, or much of anything else—as long as you believed in the potential for greatness in the Italian Nation and that Fascism was the way to get Italy there.

National Socialism is a form of Fascism that is very different in one key component—it includes a strong element of ethno-Nationalism. Adolf Hitler in Mein Kampf laid out the ethnic component of his ideology, the “folkish state”, which was a concept that was totally alien to doctrinaire Fascism.

It should be noted that after Mussolini fell under Hitler’s spell, he became a racial Nationalist himself, and the difference between the two ideologies narrowed greatly. Even today, this is a point of contention among many Fascists - some of whom follow the original concepts of Fascism, while others adopt some or all of the ideas of National Socialism.

The two ideologies share authoritarian top-down governance, very strong militarism, a belief that war strengthens the Nation and its people, and the concept of dirigisme, a state-directed planned economy in which private business and enterprise is allowed, but which is tightly regulated by the state. Nazi Germany even went so far as to adopt “Four-Year Plans”, along similar lines to the USSR’s “Five-Year Plans,” even though the official NSDAP line condemned everything about Communism as “Jewish” and anathema.

The Occult and Satanic Roots of National Socialism:

Heinrich Himmler was Reichsführer of the Schutzstaffel (Protection Squadron; SS), and a leading member of the Nazi Party of Germany. Himmler was one of the most powerful men in Nazi Germany and a main architect of the Holocaust. He was also an occultist and an avowed satanist, with a deep interest in Eastern religions, which he believed were intrinsic to Aryan mythology and history. He saw Christianity as an offshoot of Judaism, a result of Jewish subterfuge to confuse and enslave the gentiles.

He promoted a cult of ancestor worship, particularly among members of the SS, as a way to keep the race pure and provide immortality to the nation. Viewing the SS as an “order” along the lines of the Teutonic Knights, he had them take over the Church of the Teutonic Order in Vienna in 1939. He began the process of replacing Christianity with a new moral code that rejected humanitarianism and challenged the Christian concept of marriage.

All regalia and uniforms of Nazi Germany, particularly those of the SS, used symbolism in their designs. The stylised lightning bolt logo of the SS was chosen in 1932. The logo is a pair of runes from a set of 18 Armanen runes created by Guido von List in 1906. The ancient Sowilō rune originally symbolised the sun, but was renamed “Sieg” (victory) in List’s iconography.

Himmler modified a variety of existing customs to emphasise the elitism and central role of the SS; an SS naming ceremony was to replace baptism, marriage ceremonies were to be altered, a separate SS funeral ceremony was to be held in addition to Christian ceremonies, and SS-centric celebrations of the summer and winter solstices were instituted, with his base at Wewelsburg Castle being central to the pagan and satanic rituals undertaken routinely by the SS inner circle over many years.  

The Totenkopf (death’s head) symbol, used by German military units for hundreds of years, had been chosen for the SS by Julius Schreck. Himmler placed particular importance on the death’s-head rings; they were never to be sold, and were to be returned to him upon the death of the owner. He interpreted the death’s-head symbol to mean solidarity with the cause and a commitment unto death.

From Wikipedia:

Heinrich Himmler believed that a major task of the SS should be “acting as the vanguard in overcoming Christianity and restoring a ‘Germanic’ way of living” as part of preparations for the coming conflict between “humans and subhumans”.

Himmler biographer Peter Longerich wrote that, while the Nazi movement as a whole launched itself against Jews and Communists, “by linking de-Christianisation with re-Germanization, Himmler had provided the SS with a goal and purpose all of its own”.

Himmler was vehemently opposed to Christian sexual morality and the “principle of Christian mercy”, both of which he saw as dangerous obstacles to his planned battle with “subhumans”.

In 1937, Himmler declared: “We live in an era of the ultimate conflict with Christianity. It is part of the mission of the SS to give the German people in the next half century the non-Christian ideological foundations on which to lead and shape their lives. This task does not consist solely in overcoming an ideological opponent but must be accompanied at every step by a positive impetus: in this case that means the reconstruction of the German heritage in the widest and most comprehensive sense.”

Nazism had much more to do with Teutonic paganism, Theosophy and occultism than it did with Christianity, which was seen as an obstacle, not an aspiration.

It was in the Thule Society that Hitler met those who would help him take over Germany and wage the Second World War. Rudolf Hess, Heinrich Himmler, Martin Bormann, Dietrich Eckart, Alfred Rosenberg, and Hermann Goering were all said to be members. 

It was these men, along with Hitler, who used the Thule Society – and its inner sect, the Vril Society – to launch and promote the Nazi Party. But even amongst this sinister group, there was an inner core who were more evil still, if that is conceivable.

Bormann was an avowed Satanist. Bormann, together with Rosenberg and Himmler, wanted to destroy Christianity and replace it with a truly occult religion of their own making. Along with the Thule Society, they created a political party that would try to do just that.

None of this occultism and mysticism aligns even remotely with Christian (or even non-denominational) conservative values, or with what might be considered Right Wing or Far-Right ideology or thought of the time. Clearly, Nazism, and the main Nazi protagonists themselves, had a far more complex set of beliefs, motivations and perspectives than such simplistic terms can adequately describe.

Conclusion:

To assess just how German National Socialism fits within the rather archaic and at times largely arbitrary Left/Right political spectrum, one must look to the origins of these terms that date back to the French Revolution in the late 18th century.

The terms “Left” and “Right” appeared during the French Revolution of 1789 when members of the National Assembly divided into supporters of the King to the President’s right, and supporters of the revolution to his left. One deputy, the Baron de Gauville, explained: “We began to recognize each other: those who were loyal to religion and the King took up positions to the right of the chair so as to avoid the shouts, oaths, and indecencies that enjoyed free rein in the opposing camp.”

When the National Assembly was replaced in 1791 by a Legislative Assembly composed of entirely new members, the divisions continued. “Innovators” sat on the left, “moderates” gathered in the centre, while the “conscientious defenders of the constitution” found themselves sitting on the right, where the defenders of the “Ancien Régime” had previously gathered. 

When the succeeding National Convention met in 1792, the seating arrangement continued, but following the coup d’état of 2 June 1793 and the arrest of the Girondins, the right side of the assembly was deserted and any remaining members who had sat there moved to the centre. However, following the Thermidorian Reaction of 1794, the members of the Far Left were excluded and the method of seating was abolished. The new constitution included rules for the assembly that would “break up the party groups”.

However, following the Restoration in 1814–1815, political clubs were again formed. The majority Ultraroyalists chose to sit on the right. The “Constitutionals” sat in the centre while independents sat on the left. The terms extreme right and extreme left, as well as centre-right and centre-left, came to be used to describe the nuances of ideology of different sections of the assembly.

(Source: Wikipedia)

Hitler’s National Socialists, by these characteristics, clearly do not come close to the definition of “Right Wing” as it was originally posited and understood. The Nazis were clearly “innovators” (the nominal Left), proposing to radically transform Germany through revolution into their twisted image of a Utopian, ultranationalist and rigidly authoritarian society.

Hitler’s expansionism was at least in part motivated by this Utopian mindset, which required the acquisition of lands to the east (in western Russia) for Lebensraum (living space). There Hitler envisaged settling Germans as a master race, deporting most of the ethnic Russians to Siberia and using the remainder as slave labour, transforming the region into the agricultural and industrial heartland of the New Reich.

Hitler also despised the hereditary monarchy and the established religion of the Ancien Régime, and was clearly not a “conscientious defender of the constitution”, which he saw merely as an obstacle to the radical transformations that he and his Nazi cohorts aimed to undertake.

Therefore, the argument that Hitler’s Nazis were purely “Right Wing” ignores several aspects of their ideology that are inconsistent with the “Right” as Leftists like to categorise them. The Nazis were anti-conservative, anti-capitalist, revolutionary rather than reactionary, statist, and proponents of universal health care and radical environmentalism.

Nazism is more accurately described, as stated above, as a syncretic ideology³, one that combines several aspects of both “Right” and “Left” in a “Third Way”- “National Socialism” as opposed to “International Socialism”, exactly as they described themselves in their numerous personal statements and writings.

Leftists like to suggest that the Nazis must be “Right Wing” because racism was a central component of their ideological underpinnings, as though Leftists somehow, on first principles, cannot be racist. Stalin, Mao and Pol Pot were hideous racists, and clearly creatures of the Left. The early French socialists were also known to be racist, and were ultranationalists as well, tendencies that persisted throughout much of the 19th century.

Populism and demagoguery are also clearly not exclusive to either the Right or Left, with many such examples spanning the broad expanse of the political spectrum. There is barely a struck match’s difference between the Communism of Joseph Stalin (International Socialist Left) and the Fascism of Adolf Hitler (allegedly Right): both were charismatic cult leaders of these supposedly “polar opposite” ideologies, with remarkably similar methods and ambitions. Their startling resemblance to one another is, according to some political commentators at least, merely a “trick of the light”.

Those of the nominally “Leftist” political persuasion also ignore that Eugenics and Social Darwinism were central to the Nazi ideology. Each was a pseudo-scientific movement that arose almost exclusively from the Left, embraced with fervour by the Fabian Society (who described themselves as Left Wing Socialists) and by much of England’s and America’s intelligentsia, the majority of whom were also Left-leaning socialists by natural inclination and identified as such.

Eugenics was a set of beliefs and practices that aimed to improve the genetic quality of a human population. Historically, Eugenicists attempted to alter human gene pools by excluding people and groups judged to be inferior or promoting those judged to be superior. The Fabians, George Bernard Shaw in particular, were virulent Eugenicists, up to and including advocating the lethal chamber for “undesirables”: the elderly, the disabled, the feeble-minded, the mentally ill, etc.- ideas with chilling and eerie similarity to the Nazi gas chambers, and their notoriously evil Final Solution.

There are therefore clearly elements of both political “wings” in National Socialism. There is, however, nothing remotely “conservative” about the ideology of Nazism, and although I wouldn’t go so far as to call them Leftists, they have a lot more in common with the Bolsheviks under Stalin than most “Leftists” are willing to acknowledge.

Thus it can readily be seen from all that has preceded in this article that the simplistic notion that Nazism is exclusively, or even fundamentally, “Right Wing” is at the very least a flawed assertion, yet one that has routinely been accepted as received wisdom across the broadest sections of Western society. This misapprehension depends largely on the fluidity of definitions as to what constitutes Right or Far-Right ideology, and this movable feast allows those on the Left of centre an adversary on which to focus their hate (in this instance clearly well deserved), safe in the knowledge that the Nazis could not possibly share any beliefs with those they might personally, or collectively, hold dear.

Unfortunately, the reality is far more complex and nuanced than this false (albeit comforting) belief can logically sustain.

Footnotes:

1 Once in power, of course, Lenin then abolished these kinds of “Soviets”, these collective worker-based councils, and replaced them instead with his specially selected elite cabinet, which he called the “Supreme Soviet”. As Orwell noted, “all… are equal, but some are more equal than others”.

2 After WW1, the Treaty of Versailles inflicted severe punishments on Germany, including Article 231 which forced Germany to accept full and complete blame for the War and the damages therefrom.

The humiliation was completed when Germany was forced to surrender 13% of its land, give up all its overseas colonies, and pay huge financial reparations, which eventually contributed significantly to Germany’s economic ruination.

At the time, many warned that the substance of the Treaty and the harshness of the conditions would make a repeat of the conflict inevitable. Such was the resentment felt by former soldiers such as Hitler, and a significant and growing proportion of the German population, that such sentiment was ripe for exploitation by the Nazis to rally the disaffected to their ultranationalist cause.

Hitler used these circumstances to full effect in fiery speeches condemning those who “stabbed Germany in the back” during, and in the aftermath of WW1, including the Jews who were singled out for specific criticism for their alleged lack of patriotism.

This stab-in-the-back myth was widely believed and promulgated in Germany after 1918. It maintained that the Imperial German Army did not lose World War 1 on the battlefield, but was instead betrayed by certain citizens on the home front, especially Jews, revolutionary socialists who fomented strikes and labor unrest, and other republican politicians who had overthrown the House of Hohenzollern in the German Revolution of 1918–1919. Advocates of the myth denounced the German government leaders who had signed the Armistice of 11 November 1918 as the “November criminals” (Novemberverbrecher).

When Adolf Hitler and the Nazi Party rose to power in 1933, they made the conspiracy theory an integral part of their official history of the 1920s, portraying the Weimar Republic as the work of the “November criminals” who had “stabbed the nation in the back” in order to seize power. 

Nazi propaganda depicted Weimar Germany as “a morass of corruption, degeneracy, national humiliation, ruthless persecution of the honest ‘national opposition’—fourteen years of rule by Jews, Marxists, and ‘cultural Bolsheviks’ who had at last been swept away by the National Socialist movement under Hitler and the victory of the ‘national revolution’ of 1933”.

Source: Kolb, Eberhard (2005). The Weimar Republic. New York: Routledge. p. 140. ISBN 0415344425.

3 Syncretic ideology is one which combines or brings together different philosophical, religious, or cultural principles and practices.

Of Human Bondage- Slavery, a Blight on the Human Condition:

The enslavement of other human beings has been a near universal aspect of the history of man since the dawn of civilisation, affecting every race, gender and age group. Slavery is, and has always been a blight on the human condition, where those with the power to do so have routinely subjugated their fellow man into the often brutal bonds of servitude, with removal of their right to human dignity, freedom of movement, legal rights and personal self-determination and autonomy. Historically, there are many different types of slavery including, but not limited to forced labour, and chattel, bonded and sexual slavery.

Slavery has existed since records began, with the earliest known slave societies being the Sumerian and other Mesopotamian civilisations, located in the Iran/Iraq region between 6000 and 2000 BCE. The oldest known written reference to institutionalised slavery is found in the Hammurabi Code (c.1754 BCE), which states, among other things, “If anyone take a male or female slave of the court, or a male or female slave of a freed man, outside the city gates, he shall be put to death.”

Slavery in Hunter-Gatherer Societies:

There is a general consensus amongst anthropologists that slavery was not a prominent feature of hunter-gatherer societies, as they assert that mass slavery requires what they refer to as “economic surpluses” (i.e. an excess of food or resources) and a degree of population density to be a viable option. A population group needs substantial resource surpluses to house, feed, clothe and maintain captive slaves for any length of time. The mutual exchange and sharing of resources (i.e. meat gained from hunting) were also suggested to be an important factor in the maintenance of the economic systems of hunter-gatherer societies, which can best be described as based on a “gift economy”. It remains a matter of conjecture, however, just how “egalitarian” this sharing of resources may have been in reality, with some studies suggesting that the hunter males consumed >80-85% of the calories of the meat they obtained, leaving only a small proportion of meat for the women and children, who instead foraged for berries and other edible produce rather than hunting for game, with often scant reward for their labours, particularly in times of scarcity.

These hunter-gatherer societies generally consisted of small groups of 20 to 50 members, not all of whom were bound (as had been previously assumed) by family connection, although numbers could swell seasonally under shared common goals when game was plentiful, or when some form of collaboration with neighbouring tribes was otherwise mutually beneficial. Maintaining an extra mouth or two to feed, particularly in times of resource scarcity due to drought, tribal conflict or a shortage of game, was clearly impractical in many if not most circumstances.

Many hunter-gatherer indigenous societies were known to actively cull their own unwanted offspring, as well as the sick or infirm who could not pull their weight in performing the tasks required to maintain the group. Similarly, many hunter-gatherer societies routinely practiced cannibalism, and on that basis alone were unlikely to keep, for slavery purposes at least, any prisoners taken in tribal conflicts for any length of time.

That being said, prior to the establishment of the first “civilised” societies in the ancient Near East, there is no direct or written evidence to back up or refute this otherwise reasonable anthropological assumption. The only evidence we have that might in part call into question this assumption comes from hunter-gatherer societies that have lingered into the modern era. Areas where hunter-gatherers had large resource surpluses, such as the First Nations Haida people of the Haida Gwaii islands in the Canadian Pacific Northwest, are known to have had slaves due to the large amount of local marine resources at their disposal, with huge salmon runs and an extensive network of trading relationships. The Haida didn’t use slaves for agrarian purposes, but instead used them for labour in the construction of forts, tools, weapons and boats.

A similar resource surplus is likely to have been present in some indigenous island cultures, and in the more fertile regions of Africa and Asia, making it difficult to rule out the existence of some degree of slavery being a feature of these cultures prior to the Neolithic epoch. Slavery is known to have existed for centuries, if not millennia, for example, throughout the entirety of the African continent prior to the arrival of Europeans, the only question being the extent of its application, and just how far back into antiquity it first arose.

Slavery in the Ancient World:

Evidence of slavery, as stated previously, predates any written records; and this practice has existed in most, if not all major cultures at one time or another. Mass slavery, however, seems to have only proliferated after the invention of agriculture during the Neolithic Revolution, about 11,000 years ago, and has flourished in one form or another ever since, to our eternal shame.

As the world became more civilised and then organised in cities, and when farming was becoming more established, ironically slavery became more and more prevalent, usually as a consequence of such issues as poverty and debt repayment, through birth into a pre-existing slave family, or via child abandonment, war or tribal conflict, or as a punishment for various crimes. This definition of slavery doesn’t even include the vast majority of people who were consigned to the peasant classes in feudal societies throughout the ancient and medieval world, whose lives were often little better than those of the indentured slaves, and who were often viewed as a mere resource and “looked after” accordingly.

The inhuman practice of slavery in the various Mesopotamian cultures (Sumerian, Akkadian, Assyrian and Babylonian) was so entrenched and deeply rooted in this “cradle of civilisation” that it is difficult to determine how far back in antiquity it began. The earliest mentions of slavery come from Sumerian cuneiform writings, and the codes of practice for institutionalised slavery were enshrined in the Hammurabi Code around 1754 BCE, which outlines the rights of slaves, which were notably limited, and the regulations around slave ownership, which nonetheless still sanctioned harsh and brutal treatment for any indiscretion, disobedience or insolence.

In the various Mesopotamian societies, slavery was integral to the labour force, particularly in agriculture, but slaves were also utilised in the various labour-intensive trades found in those societies. Because of the substantial number of slaves, there evolved the practice of marking or branding them with either knives or hot irons to establish ownership, and the category of slave to which they belonged. Men were generally subjected to hard physical labour, whilst women and children were generally put to household tasks, although the females were often subject to sexual exploitation and harassment.

In the ancient worlds of the Greek City States and the Roman Republic and Empire, slavery was a driving force for the economy, maintaining and then promoting the prosperity of their societies. Slaves for sale could easily be found on the docks at any port, or in the markets of cities under Roman control. Meanwhile, in ancient Athens, almost everyone owned a slave, not merely those who constituted the leisure (upper) class.

The Bronze Age Minoan civilisation, centred on the island of Crete and surrounding islands in the Aegean Sea, and flourishing from around 3500 BCE until its abrupt decline in about 1400 BCE, represents the first advanced civilisation in Europe, leaving behind a number of massive building complexes, sophisticated art, and writing systems. Its economy benefited from its network of trade around much of the Mediterranean.

Although written records are scant, slavery clearly played an important role in Minoan society also, since slaves were used in building the palaces, temples and other elaborate construction works, and were also engaged in the crucial area of agriculture. Minoan society was highly stratified, and experts identify three distinct lower classes: free citizens, and then two classes of serfs, one with some rights (though not the right to possess arms), and the other with no rights at all, these being the chattel slaves (where a slave is considered to be personal property). Aristotle made a passing reference to the Laws of (King) Minos as still being in force among the Cretan serfs during his time. Certainly some slaves at Knossos were bought and sold. Minoan ships were rowed as well as sailed, on the evidence of frescoes, and we also know from the tablets of Pylos that as many as 600 rowers were required for a Minoan fleet to be deployed at sea, manned by what were almost certainly galley slaves.

The recorded history of slavery in mainland Ancient Greece, on the other hand, begins during the Mycenaean civilization (1600 to 1100 BCE), as indicated in numerous tablets unearthed at Pylos. Slavery was an accepted practice in classical Ancient Greece, and took the form of chattel slavery, in addition to the helots (Sparta) and penestae (Thessaly), who were land-bonded slaves probably more akin to medieval serfs. The principal use of these slaves was in agriculture, but they were also used in stone quarries or mines, and as domestic servants. Athens had the largest slave population of the Greek city states, with upwards of 80,000 at its height in the 5th and 4th centuries BCE, and an average of three or four slaves kept per household, except in the poorest families.

Roman civilisation dates from the founding of the city of Rome in the 8th century BCE, and over the next 1000 years acquired an Empire that took in much of Europe, and incorporated the lands and peoples surrounding the Mediterranean Sea, until the collapse of the Western Roman Empire in the 5th century CE. At its peak, it was populated by 90 million people (~20% of the global population) and covered an area of 5 million sq.km.

Slavery in Ancient Rome played an important role in society and was pivotal to their economy. Besides being exploited for manual labour, slaves performed many domestic tasks, and might even be employed in several highly skilled jobs and professions. For example, accountants and physicians were often derived from the slave class. Slaves of Greek origin in particular were often highly educated and valued accordingly. Unskilled slaves, or those sentenced to slavery as punishment, usually worked on farms, in mines, and at mills. 

Slaves were considered property under Roman law and had no legal personhood. Unlike Roman citizens, they could be routinely subject to corporal punishment, sexual exploitation (Roman prostitutes were often slaves), torture and summary execution. Over time, however, slaves gained increasing legal protection, including the right to file complaints against their masters for mistreatment.

One major source of slaves was the extensive Roman military expansion during the Republic. The use of former enemy soldiers as slaves led, perhaps inevitably, to a series of rebellions, the Servile Wars, the last of which was led by the Thracian gladiator and slave leader Spartacus. By the end of the first century CE, Italy had upwards of 1 to 2 million slaves, some 20 to 30% of its population, whilst the entirety of the Western Roman Empire at its height had 5 million slaves. Slaves were drawn from all over Europe and the Mediterranean, including Gaul, Hispania, North Africa, Syria, Germany, Britannia, the Balkans, and Greece.

The Romans differed from Greek city-states in allowing freed slaves to potentially become citizens. After manumission, a male slave who had belonged to a Roman citizen enjoyed not only passive freedom from ownership, but active political freedom (libertas), including the right to vote. A slave who had acquired libertas was thus a libertus (“freed person”) in relation to his former master, who then became his patron (patronus), and he or she enjoyed many of the rights and privileges of common citizens, except that they were not entitled to hold public office or state priesthoods, nor could they achieve senatorial rank. Any future children of such a freedman would be born free, and therefore bestowed with the full rights of citizenship.

Slavery in the Dark, Middle and High Middle Ages:

From 500CE to 1500CE, the practice of slavery expanded through war and conquest. Whether it be the Carolingians, the Vikings, or Islamic invaders, all societies engaged to a greater or lesser extent in the commoditisation of human beings conquered in times of war. Islamic invasions of India took hundreds of thousands of slaves from Peshawar, in addition to the millions taken from sub-Saharan Africa. The Tang Dynasty Chinese also took countless slaves in raids on Korea, Turkey, Persia and Indonesia, as well as thousands from indigenous tribes in their southern provinces.

On the Indian subcontinent, slavery was likely an established institution in ancient India by the start of the common era, based on writings found in texts such as the Arthashastra, the Manusmriti and the Mahabharata. Slavery was “likely widespread by the lifetime of the Buddha and perhaps even as far back as the Vedic period”.

The Manusmriti has been much criticized by the low castes (Dalits or Shudras) in India, as it institutionalized a system of slavery in Hindu society. The low caste people were relegated to a status lower than that of animals, without any rights. They were called untouchables (achoot), and were held to be so unholy that “even their shadow was considered to be polluting”.

Slavery intensified during the medieval era with the Muslim conquests of the Indian subcontinent. In the 8th and 9th Centuries, Islamic invasions under Muhammad bin Qasim took over 100,000 slaves back to Bilad Al-Sham (Syria) and Iraq. During the Delhi Sultanate period (1206–1555), references to the abundant availability of low-priced Indian slaves abound. Many of these Indian slaves were used by the Muslim nobility in the subcontinent, but others were exported to satisfy demand in international markets. Some slaves were subsequently forcibly converted to Islam.

Slavery and empire-formation tied in particularly well with iqta (tax farming) and it is within this context of Islamic expansion that elite slavery was later commonly found. It became the predominant system in North India in the thirteenth century and retained considerable importance in the fourteenth century. Slavery was still vigorous in fifteenth-century Bengal, while after that date it shifted to the Deccan where it persisted until the seventeenth century. It remained present to a minor extent in the Mughal provinces throughout the seventeenth century and had a notable revival under the Afghans in North India again in the eighteenth century.

In pre-modern China, slavery was prevalent to a greater or lesser degree throughout its history. The Tang Dynasty saw the purchase of large numbers of Western slaves from the Radanite Jews. Tang Chinese soldiers and pirates enslaved Koreans, Turks, Persians, Indonesians, and people from Inner Mongolia, central Asia, and northern India. The greatest source of slaves came from southern tribes, including Thais and aboriginals from the southern provinces, and from as far afield as Africa via the Silk Road trade routes. During the Yuan Dynasty, it was the Han Chinese themselves who particularly became enslaved in large numbers, subjugated under the occupying rule imposed after the Mongol invasions.

Although some modest steps were taken to minimise slavery in the Song and Ming dynasties that followed, the practice continued unabated throughout these eras, particularly for war captives and ethnic minorities, but households were limited in the number of slaves permitted.

Throughout Central Asia and the Caucasus, slavery was integral to the multitude of ethnically diverse societies, confederations, city states and empires that ebbed and flowed across the centuries, but which had formed in the interim a complex network of slave trading facilitating the sale and movement of commoditised human beings the length and breadth of the Silk Road.

In Indochina, over a thousand years of Chinese domination of Vietnam led to an unending stream of Vietnamese girls taken as sex slaves, whilst the Khmer Empire had a very large slave class that built the monuments at Angkor Wat from the 9th to the 15th Century. Meanwhile, in Siam (Thailand) slavery was commonplace for war captives and indigenous tribes up until the early 20th Century, while Indonesia and the Philippines had underclasses of slaves who were similarly exploited over many centuries.

In the British Isles, slavery predated the Roman occupation of 43CE to 410CE; in pre-Roman times native Britons were enslaved in large numbers, typically by rich merchants and warlords who exported these indigenous slaves to the continent and beyond. Following the Roman conquest of Britain, slavery was substantially expanded and industrialised, in line with established Roman traditions. After the fall of Roman Britain, the Angles and the Saxons perpetuated and propagated the slave system further.

From the first known Viking raid in England, at Lindisfarne in 793CE, Vikings traded with Gaelic, Pictish, Brythonic and Saxon kingdoms in between raiding them for slaves. Saxon slave traders sometimes also worked in league with Norse traders, often selling native Britons to the Irish in the Dublin slave markets. Even though slavery gradually fell out of favour with the Anglo-Saxons, by 1086, according to the Domesday Book census, over 10% of England’s population was recorded as being slaves.

With the Norman Conquest of Britain in 1066, William the Conqueror passed laws preventing the sale of slaves overseas. Scottish and Welsh raiders took slaves, but by the end of the 12th Century, slavery was largely replaced by a feudal society of serfs, villeins and bondsmen, who differed only marginally from slaves in not being a movable commodity, but remained an exploitable resource nonetheless.

In the remainder of Europe, prior to the transition to feudal societies based on serfdom, slavery was widespread, and grew more so in the wake of the social chaos caused by the barbarian invasions of the Western Roman Empire. Europe and the Mediterranean world were part of a highly interconnected network of slave trading. Demand from the Islamic world dominated the slave trade in medieval Europe. From the mid 8th Century, Italian traders in Venice in particular provided a hub for selling Slavs and other non-Christian Eastern European slaves in great numbers, many of whom had been transported in caravans from as far afield as Kievan Rus and the Baltic states, through the Eastern European Slavic lands, to be sold to buyers from Muslim Spain, North Africa, and the Middle East. In the 9th and 10th Centuries, Amalfi became a more prominent slave hub, while the Genoese became dominant from the 12th Century.

Umayyad Spain, centred on Cordoba in Al-Andalus, similarly imported an enormous number of non-Muslim slaves, as well as serving as a staging point for Muslim and Jewish merchants to market slaves to the rest of the Islamic world, in particular as a source of mamelukes (slave soldiers) for Egypt and beyond. Slavery similarly existed in the Christian controlled regions of the Iberian peninsula, where it was prevalent initially under the Roman occupation, and then continued under the Visigoths. From the 5th to the early 8th Century, large portions of the Iberian Peninsula were ruled by Christian Visigothic kingdoms, whose rulers worked to codify human bondage. Slavery persisted in Christian Iberia after the Umayyad invasions of the 8th Century, and the Visigothic law codes continued to govern slave ownership, although slaves were by then a very small proportion of the population.

The Mongol invasions and conquests of the 13th Century introduced a new force into the slave trade. The Mongols enslaved skilled individuals, women and children and marched them to Karakorum or Sarai, whence they were sold throughout Eurasia. Many of these slaves were shipped to the slave market in Novgorod, while Genoese and Venetian merchants in Crimea were involved in the slave trade with the Golden Horde.

In 1382, the amalgam of Mongols and Turkic Tatars that comprised the Golden Horde sacked Moscow, burning the city and carrying off thousands of inhabitants as slaves. For years the Khanates of Kazan and Astrakhan routinely raided other Russian principalities for slaves and plunder. Russian chronicles record about 40 raids by the Kazan Khans on Russian territories in the first half of the 16th Century. In 1521, the combined forces of the Crimean Khanate and their Kazan allies (both successor states of the Golden Horde) attacked Moscow and captured many thousands of slaves. About 30 major Tatar raids into Muscovite territories were recorded between 1558 and 1596 alone. In 1571, the Crimean Tatars attacked and sacked Moscow, burning everything but the Kremlin, and once again took many thousands of captives as slaves. In the Crimean Khanate at that time, about 75% of the entire population consisted of slaves. In total, during the three centuries of Tatar raiding, about three million Slavic peasants were taken from Russia, Poland and Ukraine, and sold to the Ottomans.

By the 16th century, slavery in Russia itself consisted mostly of those who sold themselves into slavery owing to poverty. Slavery remained a major institution in Russia until 1723, when Peter the Great converted the household slaves into house serfs. Russian agricultural slaves had been formally converted into serfs earlier in 1679, establishing the feudal system that would remain in place until the Russian Revolution.

In the latter half of the Middle Ages, the expansion of Islamic rule further into the Mediterranean, the Persian Gulf, and the Arabian Peninsula established the Saharan-Indian Ocean slave trade. This network was a large market for African slaves, transporting approximately four million of them, by conservative estimate, from its 7th Century inception to its 20th Century demise. There are references to gangs of slaves, mostly African, put to work in drainage projects in Iraq, salt and gold mines in the Sahara, and sugar and cotton plantations in North Africa and Spain. Eunuchs were the most prized and sought-after type of slave.

Slavery was an important part of Ottoman society. The Byzantine-Ottoman wars and the Ottoman wars in Europe brought large numbers of Christian slaves into the Ottoman Empire, which flourished from 1453 with the conquest of Constantinople, and lasted until the turn of the 20th Century. Because Islamic law forbade Muslims to enslave their fellow Muslims, the Sultan’s concubines were generally of Christian origin (“cariye”). The concubines of the Ottoman Sultan therefore chiefly consisted of purchased slaves.

The Devşirme system groomed young slave boys for civil or military service. Young Christian boys were periodically taken as captives from conquered villages as a levy, and some would go on to be employed in roles within government, in entertainment, or in the army, depending on their talents. Slaves could attain a measure of success through this program, with some eventually winning the post of Grand Vizier to the Sultan, and others positions within the Janissaries, the elite infantry unit that formed the Sultan’s trusted household guard.

In Mesoamerica, the most common forms of slavery were those of prisoners of war and debtors. People unable to pay back debts could be sentenced to work as slaves to the persons owed until the debts were worked off. The Mayan and Aztec civilizations both practiced this form of slavery and, although there is scant direct written evidence either way, it was likely also a facet of the progenitor Olmec society that preceded them, particularly in quarrying and moving the giant stone heads that characterised their culture.

Warfare was an important aspect of Mayan society, because raids on surrounding areas provided the victims required for human sacrifice, as well as the slaves needed for the construction of temples. Most victims of human sacrifice were prisoners of war or slaves. Slavery in the Aztec Empire and surrounding Mexica societies was widespread, with slaves known by the Nahuatl word “tlacotin”. Slaves did not inherit their status; people were enslaved as a form of punishment, after capture in war, or voluntarily in order to pay off debts.

Many of the indigenous peoples of the Pacific Northwest Coast, such as the Haida and Tlingit, were traditionally known as fierce warriors and slave-traders both prior and subsequent to European contact, raiding as far south as California. Slavery as practiced by them was hereditary, the slaves being chiefly prisoners of war, and up to 25% of a tribe’s population may have been enslaved. Other prominent slave-owning societies and tribes of the New World included the Tehuelche of Patagonia, the Kalinago of Dominica, the Tupinambá of Brazil, and the Pawnee of the Great Plains.

Slavery in African History:

Like most other regions of the world, many of the kingdoms and societies of Africa practiced slavery and forced labour for many hundreds of years prior to European contact. Slavery in northern Africa dates back to ancient Egypt. The New Kingdom (1558–1080 BCE) brought in large numbers of slaves as prisoners of war up the Nile valley and used them for domestic and supervised labour. The earliest written account to suggest that mass slavery was used in constructing the pyramids comes from the ancient Greek historian Herodotus, writing in the 5th Century BCE, who specifies that the pyramids were built with slave labour (100,000 slaves, to be specific), although interestingly he makes no mention of the Israelites as being amongst those enslaved at this time.

Recent archeological evidence claiming to debunk the use of slaves in building the pyramids lacks credibility, and sadly speaks more to the politicised nature of Egyptology, with parochial archeologists having a vested interest in downplaying the slavery narrative that the more contemporaneous sources confirm.

Chattel slavery had been legal and widespread throughout North Africa when the region was controlled by the Roman Empire (146 BCE to ca. 430 CE), and by the Eastern Romans from 533 to 695 CE. A slave trade bringing Saharans through the desert to North Africa, which existed in Roman times, continued for centuries thereafter.

After the Islamic expansion into most of the region through the trade expansion across the Sahara, the practices continued and eventually the assimilative form of slavery spread to major societies on the southern end of the Sahara (such as the Mali, Songhai, and Ghana Empires). The following excerpt from an online source makes a strong case worthy of consideration:

“Between the 4th and early 16th centuries AD, through a succession of kingdoms that included Wagadou (Ghana), Mali, and Songhai, the West African Sahel was among the wealthiest regions on earth during a period when most of Europe wallowed in medieval feudalism.  Prior to the discovery of the Americas, West Africa was the world’s largest source of gold – so much gold in fact that when the Malian King Mansa Musa (possibly the wealthiest man in human history) visited Mecca during his 14th century Hajj, his 60,000 strong retinue (including 12,000 slaves) distributed so much gold (each carried 1.8 kg of gold bars) that he crashed its value and created a decade of economic chaos on the Arabian peninsula.

The Niger River during this time possessed six times more arable land than the Nile.  In the adjacent Sahara to the north, Africans operated extensive salt mining operations.  With the arrival of the Arabs in the 8th century AD, prodigious iron smelting and blacksmithing industries came to occupy entire villages from one end of the Sahel to the other.  The West African political economy was such that no king ever enforced strict ownership over the entirety of his realm, so after the millet harvest an African peasant could earn good extra income panning for alluvial gold, mining iron ore, harvesting trees to make charcoal fuel for iron smelting, or travelling north to labor in the salt mines.

The Sahel during this period was awash in food and gold and large prosperous cities like Gao grew into architectural wonders.

In the 9th and 10th centuries AD, trade caravans from what are today Morocco and Algeria began regularly making their way south through the Sahara desert during the winter months. These caravans initially brought with them manufactured goods and luxury items to exchange for gold, ivory, specialty woods, animal skins, and salt.  But during the 13th century these caravans started supplying a vital military component to the various competing rulers of the Sahel – Barb horses.  Ownership of horses gave each ruler a cavalry, and ownership of large herds could facilitate military superiority over rivals.

The Malian, Hausa, Mossi, Bornu, Kanem and Songhai cavalries regularly battled each other for over three hundred years. Continuous combat was made possible only by a steady supply of Barb horses from the Maghreb, a market that traders were happy to oblige as the supply of gold from the Sahel appeared endless. These imported horses were expensive and were initially paid for with alluvial gold, which was starting to go into productive decline during the 15th century at about the same time the Songhai king Sonni Ali Ber led a successful campaign to defeat his enemy Mali and consolidate rule over the Sahel from Lake Chad to the Cap-Vert peninsula.  So the height of Songhai power coincided with maximum operating costs to retain that power just as alluvial gold production from the Niger River went into decline.

Saddled with the mounting expense of importing horses to maintain cavalry regiments, the Songhai lords began to launch slave raids upon the various Sahel peoples.  So as the 15th and 16th centuries progressed, slaves rather than gold became more and more the medium of exchange between the Songhai lords and the horse traders of the Maghreb.  As these traders brought more and more slaves to the Mediterranean coast of North Africa, most were purchased by Arabs but many were sold on to Europeans, where they were employed as domestic servants in wealthy cities like London and Antwerp and were considered a high status symbol – the “negars and blackmoores” of 16th century Elizabethan England.  So it was not the Europeans that first procured slaves in West Africa, but the Songhai themselves that introduced Europe to African slaves via Arab and Berber intermediaries.  Europeans at this time were a minor end customer, where the primary slave demand was provided by Arabs.

As the 16th century ground out successive years, the gold really began to play out.  Continuous and devastating slave raids depopulated the Niger River goldfield regions – crashing not only gold but also food production – and drove its inhabitants onto marginal lands that had been earlier deforested to manufacture charcoal for the formerly prodigious iron smelting industry.  Over a period of 200 years the once prosperous Sahel was transformed into a land inhabited by subsistence food scavengers and all powerful cavalry lords where the incessant demand for horses laid economic waste to this once prosperous region.

With Songhai power in the late 16th century at its nadir as a result of internecine strife and succession wars among the dead king Askia Daoud’s many sons, the Sultan of Morocco, Ahmad al-Mansur, took advantage of the ensuing political instability and sent a military expedition across the Sahara and in 1591 these 4,000 Moroccans and their cannons defeated the Songhai at the battle of Tondibi.

Thus with the defeat of the powerful Songhai Empire the coast of West Africa south of the Arab stronghold Nouakchott was left wide open to European maritime exploitation.  By 1625 the Dutch had established a permanent settlement at Gorée and the Portuguese likewise at Portudal, both located in modern day Senegal.  These initial European forays onto West African soil provided the vital resupply anchorage that enabled further permanent settlements along the entirety of the Gulf of Guinea and as far south as Namibia.”

As alluded to in the above description (the most comprehensive and articulate one I could find that did not attempt to diminish in any way the gravity of pre-colonial African slavery), it is clear that, like most of the rest of the world, the African continent and its societies engaged in slavery practices every bit as inhumane, brutal and rapacious as any other examples of this despicable trade found elsewhere, before or since. Man’s inhumanity to man clearly knows no bounds, nor any ethnic or cultural barrier.

The framework already existed for slave trading across the African continent, and it was therefore ripe for the exploitation by Europeans that followed with the Trans-Atlantic slave trade, which from the 16th to the 19th Century transported around 10 to 12 million Africans to the Americas, principally to Brazil and the Caribbean, and to a lesser extent (numerically) the southern United States.

Slavery existed for millennia throughout the entirety of the African continent prior to the arrival of Europeans. African slaves were captured, worked hard in the millet fields, scolded, beaten, sold multiple times, raped, and murdered well before the first European footprint was impressed on a West African beach.

Slavery was the natural African social condition, and it continued as Europeans colonised the continent and commercialised it with ruthless efficiency during the North Atlantic slave trade; in many places it continues today, even after most Europeans have long since left African shores.

Europeans, as a rule, usually bought enslaved people who had been captured in the endemic tribal and internecine warfare that prevailed between African kingdoms and empires; those captured from a different ethnic group were considered “others”, and therefore not thought worthy of protection or consideration, an attitude mirrored by their European counterparts.

Thus African slavery may not have been motivated by racial or religious distinctions, but it was certainly motivated on tribal grounds, which in essence amounts to the same thing: the enslavement of those “other” than one’s own closely aligned group, where that “otherness” acted as a powerful rationalisation, firstly for allowing such inhumanity and brutality to occur, and secondly for profiting from the trade, whether economically or through enhanced intertribal power and status.

Additionally, it requires mention that some examples of African slavery led to the routine and even ritualistic slaughter of captured slaves, as exemplified most graphically by the Kings of Dahomey (and also to some extent by those in Cameroon), where tens of thousands of slaves were routinely sacrificed ceremonially via beheading, with anywhere from 500 to 4000 captives slaughtered in a single day during annual festivals (giving rise to the term “Xwetanu”, meaning “yearly head business”).

The following excerpt from the same online article I sourced earlier for the extensive quote above, offers insight into how Europeans gained such extensive access to slaves for the North Atlantic Slave trade, and how the market for such slaves was used to its utmost by the indigenous African Kings and Queens, warlords and leaders as a means of consolidating their power, and promoting the ascendancy and wealth of the people who fell under their tribal hegemony.

Whilst Europeans were clearly the wholesale customer for the slave trade, transporting slaves to the New World for distribution to end customers there, a significant subset of Africans themselves also amassed great wealth and influence in their entrepreneurial procurement of those slaves. That Africans were the origination point for the West African slavery supply chain, occupying the roles of contractor, planner, procurer, and transporter to distribution hubs, cannot be denied. Clearly no one comes out of this hideous trade in human suffering with clean hands or an unblemished conscience, having either procured, aided and abetted, facilitated, or otherwise been the recipient of human beings who were routinely treated as mere objects, to be bought, sold and traded as a commodity for financial and other gain.

The Angolan Model of Contracted Slave Procurement

“The gradual encroachment of European settlements down the Atlantic coast of West Africa did not lead to immediate mass colonisation as malaria and tsetse flies kept out all but the hardiest and most rapacious adventurers.  But how did these Europeans procure so many slaves to service the burgeoning and incredibly profitable sugar and tobacco charters of the Caribbean?  The Kunta Kinte procurement model would have eventually led to depopulation of the local areas as the traditionally semi-mobile Africans would have just up and moved out of reach like they did to avoid the Songhai lords, and Africans were beginning to adopt European weapons in their defense.  So – how did so many Africans end up as slaves in the Americas despite their overwhelming numbers back in Africa?

The answer lies in the Angolan model which was by no means confined to this region alone.  During the first half of the 16th century the Portuguese established a permanent trading station at the port of Soyo, a province within the Kingdom of Kongo on the south bank at the mouth of the Congo River.  The significance of Soyo was it established the first European occupation in West Africa outside the provenance of the tsetse fly, and with trypanosomiasis absent, colonists could settle and import European livestock for the first time on the African Atlantic coast.  Entire families of Portuguese colonists began to arrive and by 1575 the city of Luanda was founded, followed by Benguela in 1587.  With Angola’s drier, more temperate climate, these early European colonists got to the business of building homes, clearing land, farming, fishing, and raising their livestock.  But one thing they did not do was get to the business of travelling hundreds of miles inland to hunt down and capture slaves.  They left that to others – and these others weren’t Europeans.

Soon after the Portuguese planted their flag at Soyo, they granted a trade monopoly to the Kingdom of Kongo which ruled over what is now northwestern Angola.  But as Portugal established colonies to the south of Soyo, these new colonies were located in lands claimed by Kongo but occupied by Ambundu peoples of the N’Dongo and Kisama states within the Kwanza River valley.  Because of the trade monopoly specifics granted to Kongo, the Bakongo could sweep through the Kwanza River valley and capture the local Ambundu and sell them into slavery to the Portuguese, but the Ambundu could not capture these Bakongo raiders and sell them into slavery to the same customer.  This egregious injustice incensed the N’Dongo king to the point of declaring war on – not the Portuguese – but the Bakongo in an attempt to break the discriminatory trade monopoly.  The Ambundu were successful and in 1556 they defeated the Bakongo in a war fought not to end the enslavement of their fellow Africans, but to extend to themselves the right to capture, enslave, and sell their Bakongo neighbors to the Portuguese.

Despite the N’Dongo victory and elimination of Kongo influence in the Kwanza River valley, the Portuguese insisted on upholding their original trade agreement, so the Kongo trade monopoly remained in place with the Ambundu still cut out of all commercial activity with the Portuguese.  Realizing they had prosecuted a war for nothing, the N’Dongo spent the next several decades threatening colonists and harassing Portuguese interests up and down the Kwanza River valley without any penetration into the colonial economy.  In 1590 N’Dongo had had enough of the commercial status quo so it allied itself with its eastern Ambundu neighbor Matamba and together they declared war on all Portuguese interests across Angola.

This war led the Portuguese to construct a network of fortalezas up and down the Angolan coastline and after years of protracted violence Portugal finally defeated the N’Dongo in 1614.  Portugal’s first act after victory was to invite their old trading partner – the Bakongo – to commence mop-up operations across the Kwanza River valley in order to clear out the defeated Ambundu and bring them in chains to the new network of fortalezas, which not only served as troop garrisons and acropoli for the local inhabitants, but also as slave depots that accommodated the swelling numbers of captured Ambundu before being auctioned off and sent to Brazil.

With the defeat of the Ambundu the N’Dongo matriarchal dynasty fled east to their ally Matamba.  There, a royal refugee named N’Zinga M’Bandi betrayed the hospitality shown her by Matamba and began secret negotiations with Luanda for a return of the Ambundu to the Kwanza River valley.  N’Zinga M’Bandi secured agreements that not only deposed the sitting Matamban queen – handing her the crown by subterfuge – but also convinced the Portuguese to nullify their long standing trade monopoly granted to the Kingdom of Kongo which, in effect, established the Ambundu peoples in the slave procurement business.

The new Matamban queen made haste regarding her political and business affairs and quickly consolidated N’Dongo and the neighboring Kasanje states under her rule.  By 1619, Queen N’Zinga had grown her realm into the most powerful African state in the region using the wealth generated from her industrial scale slave procurement undertaking.  Within a few decades of Queen N’Zinga’s ascension, the regions surrounding central Angola were depopulated of not only the rival Bakongo peoples, but of its Ovimbundu, Ganguela, and Chokwe peoples too.

The lucrative Angolan slave trade not only flourished under female African leadership, but grew scientific and efficient and continued unabated until the Portuguese crown outlawed the colonial slave trade in 1869.  However, avarice and ingenuity always prevail so after this slavery prohibition a vibrant slave black market continued unabated as abolition only served to drive up the price of slaves and therefore the incentive to procure them in the field.  These lucrative smuggling operations from Angola lasted up until the day its primary customer Brazil abolished slavery in 1888.

Today the dominance of the Ambundu peoples in the business, political, and military affairs of modern day Angola is directly traced to the business acumen, organizational skills, and operational efficiency that the Ambundu peoples developed during their 269 year monopoly over slave procurement in Angola.  From the tens of thousands of their fellow African “brothers” and “sisters” that the Ambundu sold into slavery, they accumulated incredible wealth that enabled them to occupy a position of respect, influence, and near equality in colonial Angola unparalleled anywhere in colonial Africa.  They became, in a sense, the “Master Ethnicity” of the region.”

Conclusion:

The North Atlantic slave trade was clearly an appalling atrocity, one that exacted a terrible toll on millions of innocent Africans, leaving a legacy spanning generations, and it is rightly considered a blight on European civilisation. In no way do I seek in this post to diminish the abomination of this trade, nor the role Europe played in perpetuating it, particularly in providing the compelling market forces that essentially drove it. The above commentary instead seeks, for reasons which will become apparent shortly, to provide much needed context to this debate, beyond the simplistic assessment with an unduly narrow focus that characterises much of the “mainstream” commentary on the history of slavery. A broader, more realistic and grounded view of slavery, and an acknowledgement of its very universality, means that no race or civilisation on this Earth is untainted by it, nor can any absolve itself of responsibility and/or complicity in its historical evils.

Despite the North Atlantic Slave Trade being the most well known example of slavery in the public consciousness (and, in the minds of the uninitiated, the only example), it is completely disingenuous to pretend that slavery of similar, and in some instances even greater, magnitude was not a universal and cross cultural problem, and one that is still, sadly, a present day issue to which the “enlightened” modern world routinely turns a conveniently blind eye. Europeans are neither blameless, nor should they be singled out for criticism over and above any other culture, in perpetuating this horrific trade. Slavery, as I have outlined above, has been ever-present since the beginning of recorded history, and probably dates from well before that.

In the present day, as far as we are able to ascertain and by conservative estimate, upwards of 40 million people globally are consigned to some form of slavery (including about 10 million children), while another 15 million are estimated to be in forced marriages or some other form of sexual exploitation. Very little of this modern slavery is to be found in the Western democracies.

The broader history of slavery in Europe over the preceding two millennia is that for centuries Europeans (particularly, but not exclusively, Eastern Europeans) were far more often the victims of slavery (at the hands of the Romans, the Mongols and the Islamic world in particular) than they were its perpetrators, and when they were guilty of such behaviour, it was more often than not characterised by a relationship, including the long history of medieval feudalism, between a Caucasian master and a Caucasian serf or slave.

The abolition of slavery, on the other hand, was driven initially by the English Quakers in the late 1700s, who formed anti-slavery groups that lobbied against the practice on the grounds that it conflicted with their Christian beliefs. Their lobbying gained traction, significant court actions they mounted laid the groundwork, and by 1787 the Society for the Abolition of the Slave Trade had been founded by Granville Sharp and Thomas Clarkson, along with William Wilberforce, who as a Member of the British Parliament became a pivotal figure, giving many speeches on slavery in the House of Commons and introducing 12 motions condemning the slave trade.

This Utilitarian abolitionist movement in Britain led by 1807 to the Abolition of the Slave Trade Act, effectively outlawing the British Atlantic slave trade. Whilst a momentous step, it was initially enforced only through the levying of fines; a further act in 1811 made slave trading a felony, and the Royal Navy was then called in to intercept slave ships, freeing about 150,000 slaves over several years in policing these provisions. In 1833, after much lobbying and debate, the Abolition of Slavery Act was finally passed and given Royal assent, ordering the abolition of slavery in the British colonies. Sadly, William Wilberforce died only three days after learning that its passage was assured.

As part of this Act of Parliament, slavery was abolished in most British colonies, which in turn resulted in around 800,000 slaves being freed in the Caribbean and South Africa, along with a small number in Canada. The law took effect on 1st August 1834 and put into practice a transitional phase in which slaves were reassigned as “apprentices”, a practice later brought to an end in 1840.

Sadly, in practical terms the Act did not extend to territories “in the possession of the East India Company, or Ceylon, or Saint Helena”; these exceptions were lifted by 1843. A longer process ensued, however, which included not only freeing slaves but also finding a way to compensate slave owners for their “loss of investment”. The British government set aside around £20 million to pay this compensation, and many of those in receipt of it were plantation owners from the higher echelons of British society. Whilst many are justifiably critical of this “compensation”, it should more correctly be seen as a necessary, pragmatic (if, to put it mildly, unpalatable) and ultimately effective final step that was essential to end this terrible trade once and for all. As evidence of this, most of the other European powers followed suit either very shortly thereafter, or within the next few decades at the very latest.

The role that Britain (as the pre-eminent colonial power at that time), and the Utilitarian abolitionists that drove public opinion, played in passing these acts of parliament should not be underestimated. Most of the world at that time, and for centuries before, had viewed slavery in its various forms as a natural facet of human society, one that was either approved of, or at least tacitly accepted by the majority of people across almost all cultures and nations.

The reason that I believe we need to properly emphasise the universality, across all cultures, of slavery practices prior to the abolitionist movement of the Utilitarians is to make it clearly apparent that slavery is an ongoing, real and present problem across the globe; selectively focussing on one single, albeit prominent, example from two or three hundred years ago effectively diverts us from addressing the eerily similar abominations that are occurring in the here and now.

Similarly, as we slide inexorably toward a more and more authoritarian future, where the routine subjugation of individual freedoms is becoming ever more readily accepted by people who would otherwise consider themselves “progressive”, it should remind us that the danger of falling afoul of personal and collective subjugation at the hands of those in positions of power and influence should never be underestimated. Nor should we naively expect that such atrocities could not once again reassert themselves, even in an “enlightened” Western “democracy” that considers itself above any such prospect ever eventuating.

The course of human history over the last 4,000 years or more, as I have outlined in the article above, suggests that it is complete folly to make such an assumption. Whether we acquiesce to such a future depends on our ability to fully comprehend slavery’s dark past, its present-day prevalence, and therefore the likelihood of its re-institution, by means of stealth and misdirection, into the future.

The popular perception of slavery needs to realign with the reality of the past and the present, in order to prevent its return, to any significant degree, in the future. Such misconceptions include:

  1. Slavery is a product of the dim, dark past of the human experience, and no longer relevant in the modern world – Slavery is a real and present blight on the modern world, with upwards of 60 million people in various forms of slavery at this present moment in time. From a million or so Turkic Uighurs in Chinese forced-labour internment camps in Xinjiang Province, to young women in sexual slavery (particularly in Asia and Africa), to forced child labour in much of West and sub-Saharan Africa (e.g. cobalt mining in the Congo), to forced marriages globally, to the Haratines enslaved by the Beydanes in Mauritania, to religious slavery in the Sudan and Nigeria, and to the various forms of human trafficking practised in various parts of the world, the trade and exploitation of humans is ever present, and largely ignored by those in the West not directly affected by such appalling infractions on human rights and freedoms.
  2. Slavery is a consequence of racism and white privilege, and African Americans were by far the most prominent victims of this trade – Historically, this is far from accurate, or remotely comprehensive. Slavery in the past was far more likely to have derived from the spoils of war and conflict, and was largely motivated by civilisational dominance, sectarianism, tribalism and simple monetary greed, rather than specifically by racism. Whether one is perceived as inferior, or “other”, because of one’s differing religious beliefs, membership of a different caste or tribal group, or merely because of differences in social and/or economic class, the motivations and philosophical underpinnings of the perpetuation of slavery are far more complex and varied than such a reductionist view would have us believe.
  3. Slavery is almost exclusively the province of those of privileged Caucasian heritage and imposed upon those of poor and oppressed African heritage – Anyone, of any race, religion or creed could potentially be enslaved, if the conditions are right and the power is in the hands of those willing and ruthless enough to seek such exploitation. As I have shown above, no race has a monopoly (or even a significant over-representation when compared with others) on being either the perpetrator or the victim of slavery. Moreover, where slavery was prevalent in sub-Saharan Africa, far from being poor and oppressed, those African kingdoms possessed a vast wealth that was almost beyond the imagination of their contemporary (or even most modern) European societies.
  4. Slavery in the non-Western world was a mild, benign and non-economic institution – Slaves have always been, across all races and cultures, subjected to brutal punishments, torture, castration, sexual exploitation, and arbitrary death. What is clear and readily notable by anyone with knowledge of history is the remarkable similarities in the status and treatment of slaves across various eras, and across multiple disparate cultures.
  5. Most of the slaves in the transatlantic slave trade were imported into what is now the United States – In fact, more than 90% of these slaves were shipped to the Caribbean and South America, not the United States, with Portuguese Brazil having been the recipient of the largest influx of slaves by a significant margin.
  6. Slavery was largely the product of Capitalism and Western Civilisation – As one can see from what has been written above, this practice dates back to the dawn of civilisation, and possibly even earlier, and has taken various institutionalised forms across multiple races and cultural groups ever since. Western civilisation, and the Enlightenment in particular, has done more to attempt to put an end to this blight on humanity than any other civilisation, whilst Capitalism has done more than any other single motive force to lift people out of crushing poverty, itself a significant driver of modern slavery.

You Will Own Nothing, and You Will Be Happy! (Smiley Face Emoji)

“You’ll Own Nothing and You’ll Be Happy” is a quote derived from a 2016 essay attributed to Danish MP and World Economic Forum Young Global Leader, Ida Auken. The quote has since become indelibly associated with the prognostications, and some would suggest the aspirations, of the World Economic Forum, which incorporated it into one of its promotional videos under the title “8 Predictions for the World in 2030”. So who or what exactly is the World Economic Forum, and what significance does this organisation hold, if any, for the present geopolitical situation, and for the potential future we might ultimately have to confront?

Background of the World Economic Forum:

The World Economic Forum (WEF) is an international non-governmental and lobbying organisation (i.e. an NGO) based in Cologny (Geneva), Switzerland. It was founded in January 1971 by Klaus Schwab, a German engineer and economist, changing its name from the European Management Forum to its present title in 1987. The WEF is mostly funded by its 1,000 member companies, which are typically global enterprises with more than $USD 5 billion in turnover, as well as by public subsidies. The WEF views its own mission as “improving the state of the world by engaging business, political, academic, and other leaders of society to shape global, regional, and industry agendas“.

The WEF is most widely known among laymen for its annual meetings held in Davos, a mountain resort in the eastern Alps region of Switzerland. These meetings bring together 3,000 paying members and various selected participants – among whom are investors, business leaders, political leaders, economists, celebrities and journalists – for up to five days each January to discuss various global issues across some 500 sessions.

In addition to the Davos meetings, the WEF convenes regional conferences in locations across Africa, East Asia, Latin America, and India and holds two additional annual meetings in China and the United Arab Emirates. It “produces a series of reports, engages its members in sector-specific initiatives and provides a platform for leaders from selected stakeholder groups to collaborate on multiple projects and initiatives”.

The Goals and Ethos of the World Economic Forum:

The central tenet of the World Economic Forum, and its underlying ethos is that a globalised world is best managed by a self-selected coalition of multinational corporations, governments and civil society organisations (CSOs), which it expresses through initiatives like the “Great Reset” and the “Global Redesign“. A more antidemocratic (being, as they freely admit, “self-selected”), autocratic and self-serving corporatist model is difficult to imagine.

The WEF believes that the concept of independent nations is obsolete, and must be replaced with a global government, which controls all. The WEF is therefore a fundamentally anti-democratic organisation with “globalist” (i.e. supranational) views. It is an anti-free enterprise, anti-free market group which attempts to subvert Western values and political processes for its own purposes.

It is a driving global force designed to achieve a new economic model, which Klaus Schwab refers to as “stakeholder capitalism”, and which has uncomfortable similarities to both “crony capitalism” and Mussolini’s model for fascism. Schwab does not deny that the current shareholder model of capitalism (where corporations are answerable to their shareholders) has been very successful, that it has boosted the world economy, moved billions of people out of poverty, and raised living standards over the course of the last century. He argues, however, that this model must change to “better serve the public”, which seems on the surface a noble aim, but is found wanting under closer scrutiny.

This concept of stakeholder capitalism is fundamentally undemocratic since it will result in the control of society by a handful of “elite” players, such as world leaders and business moguls, aided and abetted by economists, health bureaucrats and environmental experts: a new global aristocracy of the “enlightened” few. The WEF members believe that they, and they alone have the knowledge, and the authority to control the world’s economy and societal norms in what they perceive as the public’s best interests.

Recent periods of global instability, principally exemplified by the Global Financial Crisis of 2007–2008 and the recent COVID-19 pandemic, are seen by the World Economic Forum, by its own admission, as windows of opportunity to intensify its programmatic efforts to “reset” the world in its own image. Clearly, the WEF is principally concerned with capitalising on moments of conflict and adversity to ruthlessly pursue its agenda, attaining its ultimate goals even at the possible expense of the greater good of the societies involved, and importantly without any input or oversight from the voting public.

Criticisms of the World Economic Forum:

The World Economic Forum and its annual meeting in Davos have received much legitimate criticism over the years, particularly of late, including such complaints as:

a) the organisation’s corporate capture of global and democratic institutions, in which the WEF promotes a public-private hybrid model (i.e. Corporatism), where “selected agencies operate and steer global agendas under shared governance systems, mediated via a captured and controlled United Nations”,

b) its institutional “whitewashing” initiatives, appropriating such catch cries as “sustainability”, “social entrepreneurship” and “environmental protection” as cover for their true plutocratic goals,

c) the public cost of security, estimated at between $9 and $10 million per annual meeting, of which Swiss taxpayers foot some 25-30% of the bill for absolutely no public benefit flowing to them,

d) the organization’s tax-exempt status, an admittedly scandalous misappropriation common to many other similarly egregious NGOs,

e) unclear decision processes and membership criteria, maintaining a dense network of corporate partners that can each apply for different partnership ranks within the forum, but where there is an overrepresentation of financial companies, and an underrepresentation of such things as health care and information technology businesses, a bias being somewhat indicative of its overall attitude, tendencies and outlook,

f) a lack of transparency in its financial accounts, where neither income nor expenditure is made clearly available for public scrutiny, and

g) the horrific environmental footprint of its annual meetings, with around 1,300 private jets descending on Davos each year, whilst the organisation simultaneously lectures the world about the urgency of combating the alleged climate crisis as an existential problem for humanity caused by our collective “carbon emissions”.

“Do as I say, not as I do!” would seem to be their over-riding ethos, demonstrating once again how some of the world’s most privileged people can consign the lion’s share of humanity to under-privilege in perpetuity, and still carry on their conspicuously lavish and carbon intensive lifestyles completely unchecked by conscience, or even a modicum of self-awareness.

Why It Matters:

So, why should we be remotely concerned about some “obscure” European lobby group and think tank, no matter how grandiose its name and ambitions, and its outwardly absurd plans for realigning the global socio-economic landscape?

A 2017 quote by WEF chairman Klaus Schwab, in which he boasts of “penetrating the cabinets” of various nations, and further claims that roughly 50% of the Canadian cabinet under PM (and former WEF Young Global Leader alumnus) Justin Trudeau is aligned with his organisation’s goals and aims, sheds light on the potentially far-reaching influence of this globalist group.

In 2004, Herr Schwab founded the Forum of Young Global Leaders (the successor to his “Global Leaders for Tomorrow” programme, begun in 1993), which identifies young people in both business and the political sphere worldwide, whom it hopes will be placed in influential roles in future governments and institutions, and over whom the WEF may ultimately have some influence.

A cursory glance at the pedigree and penetration of these acolytes amongst Western governments makes interesting, and perhaps even alarming reading. WEF Young Global Leader alumni include Canadian Prime Minister Justin Trudeau, Canadian Deputy Prime Minister and Minister of Finance Chrystia Freeland, NDP Leader Jagmeet Singh, Governor of California Gavin Newsom, New Zealand Prime Minister Jacinda Ardern, former Chancellor of Germany Angela Merkel, former Australian Minister for Health and Aged Care Greg Hunt, French President Emmanuel Macron, and former UK Prime Minister Tony Blair. Other significant people of influence include the co-founders of Google Sergey Brin and Larry Page, owner of Time Magazine Marc Benioff, co-founder of Wikipedia Jimmy Wales, Facebook founder Mark Zuckerberg, co-founders of Paypal Max Levchin and Peter Thiel, and VP at Microsoft Lila Tretikov, amongst a veritable cornucopia of political, entrepreneurial, investment and thought leaders who span the globe and across the spectrum of potential influence in various fields of Science, including Artificial Intelligence and Robotics, and Medicine, many of the latter playing central roles in the recent COVID response.

From this abbreviated list, it can be readily seen that the WEF, since its inception, has worked studiously and tirelessly to establish, and then to entrench, its influence in the corridors of power in the West, and has done so with resounding success. With BlackRock CEO Larry Fink (with $USD 9 trillion in assets under management) and the heir to the Rothschild banking cartel as additional members, and Microsoft founder Bill Gates as an important “Agenda Contributor”, these “young leaders” are positioning themselves quite nicely to become the future technocratic aristocracy of the world, assuming the role of guiding humanity into the 21st Century and beyond.

Hiding In Plain Sight:

How has this organisation come to be so influential, and its tentacles become so widespread, without raising more than the odd murmur of disquiet, largely confined to fringe websites and “conspiracy theorists”? Klaus Schwab, doing his best impersonation of a Bond villain, and many of those he has mentored have stated plainly and openly, in speeches, seminars and in print for all to see, exactly what their aims for global governance are, in black and white. They spell it out, hiding it in plain sight.

This is a particularly cunning ploy on their part, as they rely on the apathy and incredulity of the majority of people in the West, and the complicity of the mainstream legacy media, to keep the plebeians suitably anaesthetised to any contrary opinions or concerns. They use foreshadowing as a technique to disarm the populace, drip-feeding their ideas in carefully couched narratives designed to placate one’s natural abhorrence of what they are actually proposing, and to allay any fears as to the potential pitfalls of concentrating such power in the hands of the few.

Many people who are peripherally aware of the WEF and its ambitions are absolutely certain that these best laid plans they have openly espoused will never come to pass. Rest assured, however, that their plans are indeed in motion as we speak and these things that will “never happen” are coming to a jurisdiction very near to you, dear reader, in the not too distant future. The WEF and the Davos elite are deadly serious, and they mean to implement their agenda by hook or by crook, with the help of their little acolytes dotted throughout Western governments and institutions around the world. Life as we have come to know it will no doubt never be the same should they succeed.

What is the Agenda of the World Economic Forum?:

The World Economic Forum’s own websites, and glossy brochures and presentations, paint a picture of an organisation striving to make the world a better and more equitable place for all of us to live better and more fruitful lives. However, closer scrutiny of the detail of some of their most ardent preoccupations and their ideological predispositions gives one a sense of foreboding as to the true motivations, and the likely potential outcomes, including possible unrecognised pitfalls, of their agenda.

Among the most central ideological underpinnings of the World Economic Forum are its focus on action on “climate change”, on reducing the carbon footprint of humankind, on so-called green energy, and on what it terms the circular economy.

A Circular Economy is “a model of production and consumption, which involves sharing, leasing, reusing, repairing, refurbishing and recycling existing materials and products as long as possible”. The “Circular Economy” aims to tackle global challenges such as climate change, biodiversity loss, waste, and pollution by emphasising the design-based implementation of the three base principles of the model: “eliminating waste and pollution, circulating products and materials, and the regeneration of nature.”

Whilst superficially this circular economy model may seem laudable, its likely pitfall is that it ignores the fact that many products, materials and waste streams are simply not amenable to a circular economy, and cannot remotely be recycled, repurposed, repaired or refurbished. Many everyday goods in a modern society are intrinsically obsolescent: base metals and alloys rust, bearings and tyres wear out, electrical circuits fail, and batteries (a particular focus of the “green economy” that allegedly underpins this model) reach the end of their useful life quite rapidly. These realities are a genuine barrier to such a model, because recycling these products is often cost-prohibitive and, importantly, energy intensive, and those recycling options will be the first thing to go in times of pending energy scarcity and/or insecurity.

Given the costliness of such approaches, the risk to businesses of this previously untested economic model could pose an existential threat to their viability. There is a lack of clarity, to be generous, as to whether the circular economy is more sustainable than the traditional linear economy, and what its social benefits, if any, might be. The linear economy makes up the vast majority of global economic activity, and much of it is not remotely fungible.

Absolute decoupling of economic growth from supply and demand, and the commensurate environmental degradation impacts may well be impossible, and would likely cause more economic and social harm than any good that might flow from it. The totality of resources used, and the environmental impact of trying to keep materials in the economy by reprocessing them, might well be greater than simply discarding them, and then making the same material or item de novo.

Another emphasis of the WEF is the establishment of the “share economy”, in which owning items is supplanted by renting everything instead, with many of the erstwhile necessities of life reverting to a subscription model. One must hire, or subscribe to a service, to have access to a product, one that will likely be either in short supply or only intermittently available, and which can no doubt be cancelled at any time should one deviate from expected societal norms. One much-hoped-for example would be the drastic reduction, or outright elimination, of the private ownership of homes, turning the rich Western democracies into societies of renters. That this aim conforms neatly to one of the most cherished desires of Neo-Marxists everywhere is a confluence that cannot be overlooked or underestimated.

In concert with these ideas, the WEF has simultaneously advocated a far-reaching global digital ID system that will collect as much data as possible on individuals and then use this data to determine their level of access to various services. The WEF suggests that this data collection dragnet “would allow a digital ID to scoop up data on people’s online behavior, purchase history, network usage, credit history, biometrics, names, national identity numbers, medical history, travel history, social accounts, e-government accounts, bank accounts, energy usage, health stats, education, and more.”

Once the digital ID has access to this huge, highly personal data set, the WEF proposes using it to decide whether users are allowed to “own and use devices,” “open bank accounts,” “carry out online financial transactions,” “conduct business transactions,” “access insurance, treatment,” “book trips,” “go through border control between countries or regions,” “access third-party services that rely on social media logins,” “file taxes, vote, collect benefits,” and more.
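To make concrete the kind of conditional access being described, here is a minimal, purely hypothetical sketch of the “no data, no service” gating pattern. No real digital ID scheme is being quoted; every attribute and service name below is invented for illustration only.

```python
# Toy sketch of an access-gating digital ID (hypothetical; illustrates the
# concept described above, not any actual WEF or government system).

REQUIRED_ATTRIBUTES = {
    "open_bank_account":   {"national_id", "biometrics", "credit_history"},
    "cross_border_travel": {"national_id", "biometrics", "travel_history"},
    "collect_benefits":    {"national_id", "tax_records"},
}

def access_granted(digital_id: dict, service: str) -> bool:
    """Grant access only if the ID profile contains every attribute the
    service demands. Note the implication: withholding your data (or having
    it withheld from you) silently locks you out of the service."""
    required = REQUIRED_ATTRIBUTES.get(service, set())
    return required.issubset(digital_id.keys())

profile = {"national_id": "X123", "biometrics": "...", "credit_history": "..."}
access_granted(profile, "open_bank_account")    # True
access_granted(profile, "cross_border_travel")  # False: no travel_history
```

The point of the sketch is that the gating logic is trivially simple to build; the contentious part is the breadth of the data set feeding it.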

The WEF also advocates an “Alternate Credit Score” (ACS) system that it claims “will ultimately lead to greater economic inclusion” by using AI and machine learning to leverage “unconventional consumer information”, in addition to traditional credit reports, to predict “creditworthiness”: social media profiles, psychometric assessments and location data, alongside asset assessments, utility payments and the like. Perhaps one should be more concerned about the obvious potential for punitive financial exclusion on the basis of this “unconventional” data collection (of political opponents or dissidents, as but one example), than about any possible improvements in “inclusivity”.
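A scoring system of the kind described can be caricatured as a weighted blend of conventional and “unconventional” signals. This is a deliberately simplistic, entirely hypothetical sketch (the features and weights are invented, and a real system would use an ML model rather than fixed weights); it is included only to show how behavioural data slots into such a score.

```python
# Hypothetical "alternate credit score": conventional credit data blended
# with behavioural signals. All feature names and weights are invented.

WEIGHTS = {
    "repayment_history":  0.40,  # conventional credit data
    "utility_payments":   0.20,  # conventional-ish
    "psychometric_score": 0.15,  # "unconventional"
    "social_media_score": 0.15,  # "unconventional"
    "location_stability": 0.10,  # "unconventional"
}

def alternate_credit_score(features: dict) -> float:
    """Weighted blend of features, each normalised to 0..1. A missing
    feature scores zero -- which is precisely how such a system can
    quietly penalise the data-poor or the data-reluctant."""
    return round(sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS), 3)

applicant = {"repayment_history": 0.9, "utility_payments": 0.8,
             "psychometric_score": 0.7, "social_media_score": 0.6,
             "location_stability": 0.9}
alternate_credit_score(applicant)  # 0.805
```

Notice that an applicant with a perfect repayment history but no social media presence at all can never exceed 0.70 under these weights: the “inclusion” mechanism doubles as an exclusion mechanism.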

This further dovetails into another prospective development being advocated by the WEF: the formation of a Central Bank Digital Currency (CBDC), replacing the US dollar and other national currencies with centralised and wholly digital money. This places the tools of control across the globe into the hands of a select number of banker elites and their minions, and would then likely be tied to one’s social credit score, with even a proposition for expiration dates to be placed on one’s wealth to remove any semblance of individual choice or autonomy. The merging of digital currency, panopticon surveillance and social credit scores into an integrated system will lead inevitably to its increasing use and eventual omnipresence in our societies, where facial recognition and biometrics are merged with our online identities, browsing histories, social media presence and financial data, and where our economic security would no doubt hinge on rigid adherence to “right think” and “socially just” actions and beliefs.
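The “expiring money” proposition can be illustrated with a toy model, entirely hypothetical and mirroring only the concept rather than any actual CBDC design, in which each unit of currency carries an expiry date and the spendable balance shrinks with the calendar rather than with spending:

```python
# Toy model of programmable money with expiry dates (hypothetical;
# no real CBDC specification is being quoted here).
from datetime import date

def spendable_balance(units, today):
    """Sum only those units whose expiry date has not yet passed.
    `units` is a list of (amount, expiry_date) pairs."""
    return sum(amount for amount, expiry in units if expiry >= today)

wallet = [(100, date(2030, 1, 1)),   # still valid
          (50,  date(2024, 1, 1))]   # already expired by mid-2025
spendable_balance(wallet, date(2025, 6, 1))  # 100
```

Under such a scheme, saving becomes structurally impossible: the 50 units above were not spent, taxed or confiscated, they simply ceased to exist once the date passed.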

Finally, the WEF has been highly active in promoting ESG (Environmental, Social and Governance) adoption among investors, where the ESG scores of various businesses and other entities are tied to investment worthiness, independent of profitability for shareholders or economic viability as an enterprise. This seeks to introduce a set of principles through which “free enterprise” is filtered, or perhaps more accurately shackled, where the “social justice agenda” informs economic choices as the first and foremost consideration, and where a uniformity of values is expected, with no deviation allowed and no correspondence entered into.

According to a report titled “Liability Risks for the ESG Agenda” by Washington D.C. law firm Boyden Gray, companies that take part in such coordinated actions against other companies or industries could be violating U.S. Antitrust Laws. The report states, “Federal law prohibits companies from colluding on group boycotts or conspiring to restrain trade, even to advance political or social goals.” Antitrust laws in general, and the Sherman Act in particular, are the Magna Carta of free enterprise. “They are as important to the preservation of economic freedom and the free-enterprise system as the Bill of Rights is to the protection of our fundamental personal freedoms.”

ESG also clashes with U.S. law as regards the legal obligation of fund managers and corporate executives to act in good faith and in the best interests of investors and shareholders. In addition to the risk that ESG asset managers violate their fiduciary duty to investors, the corporate managers of those businesses are themselves forced to violate their duty to act in the best interests of the company’s shareholders.

Diversity, Equity, and Inclusion (DEI) programs, a component of ESG, are also coming under fire, and quite rightly, both as mandatory employee training and as hiring criteria. ESG could also run afoul of U.S. race discrimination laws, where the push for racial and gender equity could well be seen to violate the Civil Rights Act of 1964, which prohibits any discrimination on the basis of race, colour, sex, religion, or national origin.

Other Western nations clearly have similar laws, and similar foundational principles to the U.S., at least in theory, although clearly the legal systems in some of these countries are somewhat more pliable or more fragile than others, and therefore more amenable to acquiescing to such potential infringements of the principle of one law for all, to be applied equally without fear, favour or discrimination.

Altruistic Philanthropists or Deluded Narcissists with an Inflated Sense of Their Own Infallibility?

As can be readily seen from the above, the World Economic Forum is at the vanguard of debating, and more often than not advocating for, a transformative global agenda that traverses many aspects of society. In theory, the collaboration of some of the world’s brightest minds, its most successful business leaders, and some of the most prominent members of the global political and financial elite should be encouraged. It would appear preferable to the ad hoc, piecemeal approaches taken by individual nations, which often conflict with or contradict one another in any attempt at a concerted and efficient response to regional or global problems, or to the challenges societies are likely to face in the future.

The litany of problems with such an elitist, top down organisation like the WEF, with such a broad reach and deep penetration into the bureaucratic apparatus of the governments of multiple nations, should be obvious to any student of history, or to anyone who is remotely aware of the universally corrupting influence of absolute power.

Even the most well-intentioned and otherwise altruistic people can be seduced by an inflated opinion of their own intelligence, and this hubris can encourage a sense of infallibility when promoting ideas of societal transformation that seem superficially appealing, while simultaneously blinding one and all to the potential pitfalls and possible harms that such grand schemes could conceivably produce, harms that would be readily obvious to any comprehensive and analytical appraisal of these propositions.

Being surrounded by a self-selected cohort of ideologically aligned people, all with similar collectivist world views and socioeconomic backgrounds, creates an environment conducive to “group think” that is self-reinforcing, and which actively discourages any dissenting voices, or even healthy and robust debate. Add in a generous helping of narcissism, a character trait over-represented amongst the elite achievers, movers and shakers, and you have a recipe for potential disaster, even with the best of intentions.

Over and above the arrogance and narcissism of the nominal global “elite”, who have often become all too enamoured with their self-perceived intellectual superiority and their tightly held and much beloved beliefs, it is important to acknowledge that such people are, by definition, completely detached from the common man. Those comprising the would-be global aristocracy tend to gravitate to other such people in the same intellectual space and strata of society, many if not all of whom have similar experiences that are either divorced from the concerns of the working classes, or completely oblivious to the desires, concerns and aspirations of the people who comprise the mass of humanity.

Not only are these privileged few detached from the aspirations and outlook of working class people, whom they often look down upon as either too ignorant, unintelligent, bigoted or inconsequential to care about, but they are also detached, both practically and intellectually, from the consequences of any decisions they decide to make allegedly on their behalf. Thus when elite, supranational organisations like the WEF presume to take actions to transform society, the last consideration they remotely entertain is the effect these actions will have in practical terms on their fellow human beings.

One need only look at the bureaucratic mandarins that populate the upper echelons of the European Union or the United Nations to see the consequences of such institutionalised detachment, where these centralised top down command structures are often making substantive decisions, but are so distant and disconnected from those whom they allege to serve that they remain utterly oblivious to the ramifications such decisions might entail for those “at the coalface”.

The Evidence of Our Eyes is Cause for Concern:

The track record of the WEF and those aligned to them does little to inspire confidence in the altruism and good intentions of the organisation, no matter what benign and benevolent image they might care to project in their allegedly idealistic pursuit of global transformation.

The WEF has been full-throated in supporting the institution of “net zero” compliance, primarily across the Western world, with a suite of “Climate Change” policies entailing carbon taxes, bans on fertilizers, the shutting down of conventional energy production, the marketing of costly and inefficient electric vehicles that threaten to deplete both the power grid and finite mining resources, and the legislating of largely unworkable and ultimately extortionate renewable energy generation infrastructure, whose main effect has been to impoverish both producers and ordinary citizens in bringing about an alleged “transition” that increasingly resembles a road to nowhere (as I have elaborated upon in a post elsewhere on this blog, under that very title).

As is often the case in recent times with those who aspire to a supposedly idealistic or Utopian mindset, this often goes hand in glove with a Malthusian and/or a misanthropic point of view that perceives the broad swathe of humanity as somewhat expendable for the greater good of the “brave new world” that they hope to achieve. The common man is merely grist for the mill in achieving the larger and far more noble goals of those enlightened few destined to shape the global future. The resultant supply chain disruptions, government-induced food and fuel shortages, rampant inflation and economic upheaval are clearly a small price to pay when there is a “planet to save”, and a new Elysium to be founded in their image.

The COVID 19 Response- A Chance to Peer Behind the Curtain:

As a prelude to what we might expect under the “guidance” of this new World Economic Forum techno-aristocracy, one need only look to the recent COVID response for clues as to the likely path they will traverse once sufficient power is concentrated in their hands. The COVID pandemic response was, in large part, instigated by, and/or vigorously implemented at the behest of, various acolytes of the WEF.

WEF-aligned politicians in the UK, Canada, Australia and New Zealand (e.g. Justin Trudeau, Jacinda Ardern, Matt Hancock, Greg Hunt, etc.) were demonstrably amongst the most ardent exponents of some of the most draconian and heavy-handed responses to the COVID 19 pandemic. They advocated the imposition of indefinite hard lockdowns for society as a whole, severe restrictions on international travel, blanket school closures, COVID passports, vaccine mandates and universal masking requirements (even for children), as well as the establishment of purpose-built quarantine centres and, troublingly, a form of medical apartheid. These measures effectively closed down the public life and economic structures of entire nations, leading to widespread loss of livelihoods, physical and psychological illness, and a culture of fear and dread that caused innumerable social problems through a spiralling increase in the prevalence of domestic violence, substance abuse, depression and suicide.

These were all foreseeable problems and pitfalls that were the almost certain consequence of such unprecedented measures, and yet those charged with making those decisions seemed largely blind and even callously indifferent to these likely outcomes, and they have thus far largely avoided any significant culpability or recrimination for their part in what can only be described as a complete debacle.

There were numerous alternative sources pointing out that the “science” around COVID 19 was actually far more nuanced, with facts that often ran entirely contrary to the “expert” advice and much of the mandate zealotry, but these alternative sources of information were deliberately silenced by a vast mainstream and social media campaign. Evidence since, in the cold light of day, suggests that mainstream media platforms worked in tandem with Big Tech, the CDC and the US political establishment to quell any and all dissent, regardless of merit.

Amongst the main players in this concerted campaign to ensure that only one side of the argument was presented, and the narrative maintained regardless of any deleterious effects or even demonstrable (and often entirely predictable) harms, were the aforementioned WEF “Agenda Contributor” Bill Gates, many of the Big Tech founders and CEOs mentioned previously, and the several WEF-aligned politicians named above who were in positions of decisive influence, particularly in the UK, Canada, Australia and New Zealand.

Many of those WEF acolytes advocated some of the harshest, even Dickensian, punishments for any deviation from COVID mandates: withholding medical care of any kind from the unvaccinated, house arrest and/or fines for the unvaccinated, imprisonment for questioning the efficacy or safety of the COVID vaccines, forced quarantine in purpose-built facilities, and even having the children of the vaccine-hesitant forcibly taken from them for the crime of being unvaccinated. This, of course, was in addition to the mandatory loss of gainful employment for a large proportion of such people, not just in the medical field or the Aged Care sector, but even those in solitary occupations such as truck driving or sanitation work, where personal choice (whether one agreed with that choice or not) would have had little or no impact whatsoever on public health.

Some of those with the most hardline attitudes to lockdowns and mandates were subsequently caught flouting those same rules themselves, taking overseas trips to holiday destinations, partying behind closed doors whilst others were locked down within their homes, and being maskless when the cameras were switched off- actions which gave the impression of “mandates and restrictions for thee, but not for me”.

For the WEF true believers, the concepts of bodily autonomy being sacrosanct, of a citizen’s right to self-determination, of conscientious objection to a medical procedure, and of taking personal responsibility for one’s own health decisions were entirely anathema. Meanwhile, the freedom to speak out and to debate the merits or otherwise of public health policy was sacrificed as counterproductive to the greater good, such rights being deemed necessarily suspended given the alleged urgency and seriousness of the COVID 19 pandemic. This was highly questionable and unprincipled behaviour on moral and ethical grounds, and as time has gone on, it has become increasingly clear that such policies were built on shaky intellectual and scientific foundations.

The COVID Response as a Template for Exploiting the Climate of Fear:

This authoritarian COVID 19 pandemic response, whether motivated for good or ill depending on your perspective, established a number of unfortunate precedents that are likely to be exploited at the first opportunity when the next crisis, real or perceived, comes to the fore. The fear created by the potential existential threat of a global pandemic exposed how readily the broad mass of Western society complied with the most over-zealous, and often entirely unreasonable, restrictions, many of which made little or no medical sense and often caused significantly more harm than good in the long term. It also established how readily and unquestioningly the majority deferred to those perceived to be, or represented to them as, “expert authorities”, and how hostile many of these same people became to any dissenting voices or opinions that challenged their preconceived ideas, or the narrative fed to them by such “trusted sources”.

The majority of people in our society effectively stopped thinking of the maintenance of their health as a matter of individual responsibility and personal choice. Societal solidarity came to eclipse the worthy aspiration of individual autonomy, and the need to feel secure in the face of a perceived danger was elevated over any and all concerns for human liberty, in favour of the concepts of “public health” and “collective obligation”, even though many of the measures being promoted (the masking of children, and COVID vaccination as a means of preventing transmission, for example) were of dubious merit and had little or no impact on the health outcomes of others.

This mindset can readily be translated, by those with the power and inclination to do so, from health mandates to other forms of “collective responsibility”, particularly as it might relate to protecting the natural environment and, as an extension, the nebulous aspiration of “protecting the climate”. Whilst human health and protecting or rehabilitating the natural environment are undoubtedly very worthy goals, it is important to realise that a holistic approach to these matters requires considering all of the potential impacts that any mitigating action or response might have on the lives, and the livelihoods of individuals within our society.

Curbs on individual behaviour and resource use to allegedly serve “public health” or the natural environment involve an unhealthy confusion of politics and “science”, and are often motivated by activism rather than a genuine concern for people or the environment, particularly by those who would consider themselves part of the global “elite”. The Covid-19 pandemic response template might conceivably turn out to have been a mere dress rehearsal for wholesale “climate action” mechanisms to attempt to offset a highly speculative, allegedly human induced “global warming”.

Even if there were universal agreement that these climate problems arise wholly from human activity, which is highly speculative in and of itself, the debate should then turn to the likely efficacy of the gamut of remedial actions presently available for the problem at hand. The quest for solutions to any alleged crisis should begin with an understanding that many government interventions are more likely than not to cause more problems than they remedy.

The End Game:

As we currently spiral into a global economic abyss of our own making, it is sobering to see just how we have got to this point, how many principles we have had to bypass in the process, and how many people in political power and positions of influence, like those at the WEF, have worked so doggedly to undermine the aspirations and interests of the people they allege to serve. As we prepare to combat prospective threats that might arise in the future, our collective ideological blinkers have become blinders, and the people guiding us through this minefield are either malignant and narcissistic demagogues, cognitively impaired puppets, corrupt careerists looking to game the system, or all of the above.

Our society has been, in recent times, very successfully divided on political, ideological, racial, religious and ethical lines because we have, in the fog of ideological warfare, lost sight of those basic fundamental principles that should ideally unite us. Western civilisation as we know it, one that has been studiously built up by previous generations over the last hundred years or more, will almost certainly fall apart, and in the not too distant future, because we have not treasured and protected it. We have instead allowed it to be systematically gamed, corrupted, undermined and white-anted from within by the Davos elite and people just like them, whilst engaging in word games with ideological opponents and cheering on entirely unsuitable “representatives” in our parliaments to perpetuate its destruction.

We have stood on the shoulders of giants (our forebears), only to piss on their legacy from a great height.

It now falls to men of conscience and those with eyes to see to provide practical assistance to those less fortunate, to highlight the difficulties faced by real people in the real world, and to uphold the principle that the true measure of an individual is found not in the colour of his or her skin but in his or her qualities as an individual. It falls to these men to champion freedom and the rights of the individual, and to fight against the inefficient and callous edifice being constructed by an ever burgeoning Big Government bureaucracy that is thirsting for more and more power and influence with less and less delivery or accountability.

It falls to these conscientious souls to find real, practical solutions for indigenous peoples to bring them out of the spiral of poverty that they experience, rather than miring them in the morass of race identity and race politics that will inevitably serve only to divide them further from the broader society, and thus further entrench their disadvantage.

The fool’s paradise of economic prosperity to which we have become so accustomed over the last 50 years is about to come to a precipitous and traumatic end. The current Ukrainian situation, and subsequent sanctions on Russia directly after the COVID 19 response, are threatening to crush global energy and food security, whilst supply chains also threaten to grind to a halt as all the interconnected pieces of the global economy are only as strong as the weakest link.

This provides convenient cover for the systemic bureaucratic and political incompetence that has characterised Western governance over that period. The stock, commodity and money markets have become totally dissociated from the real value they are meant to represent, the small business engine room of the economy has been decimated (in the correct use of the term), and rampant quantitative easing mimicking the excesses of Weimar Germany has hollowed out the productive and value-adding elements of the global economy, substituting a speculative bubble of unimaginable proportions that is set to burst once all the main players have found a safe haven from its worst effects.

The result will be wealth concentration even greater than the egregious levels we are experiencing now, where the 1% become the 0.0000001% who have all the marbles. Many of the Davos elite are positioned to take maximum advantage of just such an economic calamity should it arise.

The road to serfdom for the rest of us is the path we are now travelling down, aided and abetted by our political masters at the WEF and their ilk, and we are past the event horizon where there is likely nothing more we can do to alter this outcome. We are inevitably headed for an increasingly authoritarian future, with supranational organisations like the World Economic Forum at the forefront. The time to make a stand against that has likely already passed.

The foundations for a panopticon surveillance state have already been laid down, with Communist China as its template and testing ground, and social credit scores will be our future, whether we like it or not. We in the West are already in the midst of our own version of the Cultural Revolution, already detaining dissidents in near-indefinite detention without trial, and so it is only a matter of time before the five-year plans, the social credit scores and all the other adornments of a communist dictatorship become accepted practice in what was once the “free world”.

On a Road to Nowhere – The Renewable Energy Delusion:

From AA Martin (Source: Shutterstock.com)

As the vast majority of Western democracies career headlong down the path to renewable energy “nirvana”, the inherent contradictions and incongruities of this “net zero” dogma are becoming ever clearer, and the road we have collectively embarked upon increasingly looks like a choice between a road to nowhere, a freeway overpass to a precipice, or a one-way street to a blind alley, abruptly terminating in a dead end.

The current invasion of Ukraine by Vladimir Putin’s Russia, and the sanctions imposed by the West in response, have only served to bring the potential calamities of this approach into stark relief, where the sudden absence of a mere 10% of the world’s “fossil fuels” (i.e. hydrocarbons) has led to severe supply chain issues, runaway inflation (due in large part to the rising energy input costs of the renewables transition), and to pending widespread energy rationing and grid failures. And this is only the beginning of the profusion of trials and tribulations about to be unleashed in pursuing the renewable energy delusion.

No amount of failure, current or impending, can possibly dissuade the energy transition ideologues from denying the reality about to confront all of us, nor persuade them of the folly they have all encouraged us to pursue in the delusional desire to control the world’s climate through energy policy. Two decades or more of aspirational policies, at a cost conservatively estimated at $5 trillion USD, with extensive subsidies for “renewable energy” generation coupled with punitive encumbrances on hydrocarbon exploration and generation, have not only failed to facilitate this much-ballyhooed transition, but global hydrocarbon production has actually increased in spite of these policies, whilst global Solar and Wind generation has failed to exceed even 2% of global energy generation.

Wind and Solar generation, coupled with Battery storage are advocated as the solution to our alleged, and highly speculative global climate problems, and yet the experience gained over the last 20-30 years has been that these modalities are not “free” or even remotely cheap, are not inherently “clean” or even independent of hydrocarbon requirements themselves, are not in any sense “renewable”, and are entirely incapable of being surged in time of need, or even of being depended upon to be reliably available at any given moment in time.

Being so lacking in dependability and dispatchability makes complete reliance on Wind and Solar generation a fool’s errand, and the pursuit of this net zero mirage can only lead to the destruction of Western societies that took hundreds of years of hard toil, sacrifice and heartbreak to build. The insistence on minimising or avoiding altogether the use of hydrocarbons in the generation of electricity, in heating homes, in manufacturing goods in factories, and in transporting people and goods from one place to another, jeopardises the viability of the vibrant economies of the West that underpin these prosperous and previously relatively harmonious societies.

A Word on Dispatchability, and the Failure of Logic:

According to recent mainstream media reporting, renewable energy advocates have claimed that “the sun and wind are a known reliable source for dispatchable power. This is irrefutable.” Nothing, of course, could be further from the truth, but then logic rarely enters into the equation when zealotry is the common denominator.

“Dispatchable”: adj. “A dispatchable source of electricity refers to an electrical power system, such as a power plant, that can be turned on or off as required; in other words, one that can adjust the power output it supplies to the electrical grid on demand.”

Wind power generation cannot be dispatched on demand when the wind doesn’t blow, nor when it is blowing too hard, nor in the Northern Hemisphere during severe blizzards, nor in many other adverse weather conditions. Solar generation, similarly, cannot be dispatched on demand on days with minimal sun due to cloud cover, nor at high latitudes for a large proportion of the year, and it certainly cannot be dispatched at night, on any night.

To illustrate: suppose (for the sake of argument and round figures) we have a nameplate Wind or Solar capacity in New South Wales, Australia of 1000 MWe. How much power can we rely on receiving at 10 a.m. on 1 December 2022? Even knowing the demand in advance, and knowing that the average Capacity Factor is 25%, will the output on that day and at that time be 2% of nameplate, or 50%? The answer is that it is entirely unable to be predicted in advance with any degree of certainty.

Therein lies the problem: inherent and irredeemable intermittency and unreliability. One cannot predict what power an energy grid, and the businesses and services that depend upon it, will have on a given day and at a given time, because we have placed ourselves at the mercy of the elements, just like the cave dwellers of old.
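The distinction drawn above, between an average capacity factor and the unknowable output at any given hour, can be made concrete with a short back-of-envelope sketch. The nameplate and capacity-factor figures match those used in the text; the sample hourly outputs are invented purely for illustration:

```python
# Illustration only: 1,000 MWe nameplate and a ~25% average capacity
# factor, as in the text. The sample hourly outputs below are invented.
NAMEPLATE_MW = 1000
AVG_CAPACITY_FACTOR = 0.25

# The long-run *average* output is easy to state...
avg_output_mw = NAMEPLATE_MW * AVG_CAPACITY_FACTOR  # 250 MW on average

# ...but output at any single hour can fall anywhere in a wide band,
# from dead calm (zero) to near half of nameplate.
sample_hourly_output_mw = [20, 480, 150, 0, 310, 90]

low = min(sample_hourly_output_mw)
high = max(sample_hourly_output_mw)
print(f"Average {avg_output_mw} MW, but any given hour may deliver {low}-{high} MW")
```

The average tells a grid operator nothing about what will actually be available at 10 a.m. on a particular day, which is the whole point of the dispatchability argument.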

Wind and Solar- Beyond Redemption:

Large Scale Wind Farm (Land Based)

Wind and Solar power are given prominence in the mainstream media and amongst alleged “experts” as supposedly suitable as large scale, base load alternatives to efficient fossil fuels like natural gas, oil and coal.

Both technologies suffer from irredeemable intermittency that intrinsically makes them unsuitable for the provision of 24/7 base load power. They also introduce unwanted and often dangerous instability into power grids, in both voltage and generated power, while typically delivering only a fraction of their nameplate capacity (capacity factors of ~15-25% on average, depending on age and location). Note too that these averages say nothing of the predictable delivery of power at a given moment in time, an important concept where these modes of generation fail hopelessly.

Off Shore Wind Farm (Source: stopthesethings.com)

All power distribution systems need to be able to cope with varying load, and in most cases this is handled by “spinning reserve” that can be brought on stream very quickly. Thus one needs a constant power input, usually from coal, gas, hydroelectric or nuclear generators, in order to maintain the spinning reserve, keep pace with load changes, and hold the frequency at 50Hz (60Hz in US territories). The frequency is very important, not only as a time signal, but because, for example, a lot of medical equipment relies on that frequency for its operational accuracy.

In the case of Solar and Wind, we have another variable supply predominating, which adds a level of complication that increases exponentially as the proportion they supply to the grid increases. Solar panels produce a Direct Current output, and most modern wind turbines rectify their variable-frequency output to DC, which must then be converted to Alternating Current at the right frequency and in synchronisation with the main supply. The main supply is, of course, used as a reference signal, but what happens to Wind and Solar supply when that signal is not present?

“Renewables” are often claimed, erroneously, to be the “cheapest form of power”. This claim completely ignores the billions upon billions of dollars of subsidies and extra infrastructure investment, in poles and wires, grid stabilisation, battery storage and fossil fuel back-up, required to maintain the illusion that Wind and Solar are inexpensive. Similarly, merely comparing costs 1:1 between base load supply and the add-on supply is a complete misrepresentation (and even then the add-on costs are often higher than their fossil fuel equivalents, especially once subsidies are taken into account), because one always needs that base load supply to provide the reference frequency and to cover hourly, diurnal and seasonal shortfalls. One must always keep this spinning reserve going, so that it can respond to sudden peaks in demand.

So those costs must be counted on both sides of the equation, but seldom if ever are they factored in by renewables advocates, nor by the government regulatory agencies reputedly responsible for determining the cost-effectiveness and viability of this technology. As such, these government instrumentalities have failed in their duty of care to taxpayers by not showing due diligence in mandating the transition from efficient, low-cost, predictable sources to highly expensive, dangerously intermittent and unreliable ones.

Meanwhile in Germany, home of the Energiewende folly of extensive Wind power investment, power bills rose 43% in the first six months of the year due to wild fluctuations in unreliable renewables: at times producing too much, requiring load shedding to neighbouring countries at negative prices (i.e. paying them to take it), whilst at other times having to pay a premium spot price, mainly to France, when the wind didn’t blow and the sun didn’t shine, which was often.

The plain and simple fact is that both Wind and Solar have an EROI (Energy Return On Energy Invested, an important engineering concept) that is too low to support a functioning modern technological society (Solar PV without storage 3.4, and with storage 1.6; Wind without storage 16, but with storage 3.6). An EROI of at least 9 is required for a society to function sustainably, generating sufficient surplus power even to replace the power generating infrastructure as it reaches the end of its operative life. To make matters worse, Wind and Solar do not allow societies, at the current state of technology, to safely eliminate more than an insignificant amount of their base infrastructure (including coal fired or natural gas base load capacity), in spite of the massive capital investment made in these renewable plants all around the world.
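The EROI comparison above can be sketched directly. The ratio itself is simply lifetime energy delivered divided by energy invested; the figures below are the ones quoted in the passage (taken as given, not independently verified here), checked against the threshold of 9 that the passage uses:

```python
# EROI = (lifetime energy delivered) / (energy invested to build and run).
# Figures as quoted in the text; the viability threshold used there is 9.
EROI_THRESHOLD = 9

eroi = {
    "Solar PV (unbuffered)": 3.4,
    "Solar PV (with storage)": 1.6,
    "Wind (unbuffered)": 16,
    "Wind (with storage)": 3.6,
}

# Which sources clear the threshold once storage ("buffering") is included?
viable = {source: value >= EROI_THRESHOLD for source, value in eroi.items()}
for source, ok in viable.items():
    print(f"{source}: EROI {eroi[source]} -> {'clears' if ok else 'below'} threshold")
```

On these numbers, only unbuffered wind clears the bar, and adding the storage needed to make it dispatchable drops it well below.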

These “renewables” are thus inherently reliant on back-up of virtually the same nameplate MW capacity (e.g. natural gas, coal or nuclear), introducing a costly layer of redundancy into the system. Wind and Solar systems generate too little over their limited lifespans to sufficiently offset the energy used in their manufacture, distribution and maintenance, and they are prohibitively expensive (as a consequence of their inefficiency, cost of manufacture and infrastructure gold-plating), to the point of rendering the industries and commercial interests reliant upon them increasingly uncompetitive and uneconomic.

The government subsidies used to artificially prop up this ineffective and impractical technology have been systematically misappropriated, and they have failed in their alleged task of making most, if not all, of these ventures competitive or at least viable. “Renewables” have likewise failed to spur the technological innovation required to allow large scale storage of the power generated, or to scale these facilities up to a level sufficient to provide more than an expensive token percentage of our overall power needs without importing the balance from elsewhere.

In addition, neither wind nor solar can truly be considered renewable. They have far shorter life spans than advertised (often as little as 10-15 years, compared to 50+ years for coal fired power stations), and they are a looming toxic burden on the environment. This is exemplified by the rare earth metals mining required for the permanent magnets used in wind turbine generators (lifespan 10-15 years, with diminishing power yields after 4-5 years), by the highly toxic and carcinogenic substances (Arsenides, Lead, Cadmium, Chromium VI, Sulphur Hexafluoride, Thiourea, CCl4, SiCl4, Selenium Hydride, Germane, etc.) required in the manufacture of photovoltaic solar panels (note: over 4 tonnes of toxic waste and chemicals are required for every tonne of photovoltaic material manufactured), and by the upcoming issue of their disposal en masse at the end of their life within the next decade. While not necessarily more toxic than alternative modes of generation, they are neither as “clean” nor as “renewable” as their advocates pretend.

World’s Largest Solar Farm in India (648MW to power ~150,000 homes) (Source: electrek.co)

In Solar PV technology, the physics boundary for silicon photovoltaic cells, known as the Shockley-Queisser Limit, is a maximum conversion efficiency of 33.7% of incoming sunlight into electricity, and the very best commercial PV technology today already exceeds 26%. Solar technology is therefore approaching the theoretical limit of how much more efficient it can ever become.

The physics boundary for a Wind turbine, known as the Betz Limit, is a maximum capture of 59.3% of the kinetic energy in moving air, and commercial wind turbines today already exceed 40%, once again suggesting that the technology has approached its theoretical efficiency limit, and that what we have now is almost certainly as good as it is ever likely to get. Moore’s Law may apply to silicon chips and computerisation, but it has no relevance to the theoretical limits of renewable power generation.
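The point about both limits can be quantified as remaining “headroom”: the ratio of the theoretical ceiling to today’s commercial performance. The ceiling values are the standard published limits (Betz is exactly 16/27); the current-performance figures are the ones quoted in the passage:

```python
# Theoretical ceilings vs. current commercial performance.
# Betz limit: an ideal turbine captures at most 16/27 of the kinetic
# energy in moving air. Shockley-Queisser caps single-junction PV
# conversion at ~33.7%. Current figures (0.40, 0.26) are from the text.
BETZ_LIMIT = 16 / 27          # ~0.593
SQ_LIMIT = 0.337

wind_now, solar_now = 0.40, 0.26

# Best-case remaining improvement, as a multiple of today's performance.
wind_headroom = BETZ_LIMIT / wind_now     # under 1.5x, even in theory
solar_headroom = SQ_LIMIT / solar_now     # around 1.3x, even in theory
```

In other words, even perfect engineering could improve wind capture by less than 50% and solar conversion by roughly 30% over today’s figures; there is no order-of-magnitude gain waiting to be unlocked.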

Wind-farm capacity factors have improved only marginally in recent times, at about 0.7% per year, and this small gain comes mainly from reducing the number of turbines per acre, at the cost of a 50% increase in the average land used to produce each wind kilowatt-hour.

Batteries at the current level of technology are no better, either. No digital computer-like 10x gains will ever exist for such batteries, because the maximum theoretical energy in a pound of oil is 1,500% greater than maximum theoretical energy in the best pound of battery chemicals.

It takes the energy-equivalent of 100 barrels of oil to fabricate a quantity of batteries that can store the energy equivalent of a single barrel of oil. About 60kg of batteries are needed to store the energy equivalent of 1 kg of hydrocarbons. At least 300-500kg of materials are mined, moved and processed for every kilogram of EV battery fabricated. Storing the energy equivalent of one barrel of oil, which weighs 136 kg, requires 9,000 kg of Tesla batteries (at a cost of $USD 200,000).
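As a sanity check on the mass figures just quoted, here is a back-of-envelope sketch. The energy-density inputs (roughly 1,700 kWh of thermal energy per barrel of oil, and roughly 0.19 kWh per kg of lithium-ion pack) are my own assumed round numbers, not figures from the text, and the thermal-versus-usable-energy distinction is deliberately glossed over:

```python
# Back-of-envelope mass comparison (assumed round figures):
# a barrel of oil holds ~1,700 kWh of thermal energy and weighs ~136 kg;
# lithium-ion packs store roughly 0.19 kWh per kg at the pack level.
BARREL_KWH = 1700.0
BARREL_KG = 136.0
BATTERY_KWH_PER_KG = 0.19   # assumed pack-level energy density

# Battery mass needed to hold one barrel's worth of energy.
battery_kg_per_barrel = BARREL_KWH / BATTERY_KWH_PER_KG   # ~9,000 kg

# Battery mass per kg of hydrocarbon.
mass_ratio = battery_kg_per_barrel / BARREL_KG            # ~60-70 : 1
```

Under these assumptions the arithmetic lands in the same range as the figures quoted in the passage (roughly 9,000 kg of batteries per barrel, and a mass ratio of about 60:1).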

A battery-centric grid and car world means mining gigatons more of the earth to access lithium, copper, nickel, graphite, rare earth metals, cobalt, etc.—and using millions of tons of oil and coal both in mining and to fabricate metals and concrete. Carrying the energy equivalent of the aviation fuel used by an aircraft flying to Asia from the U.S would require $60 million worth of Tesla-type batteries weighing five times more than that aircraft itself.

One has to mine more than 250,000 kg of material to make a single 500 kg EV battery at the current state of technology, and this is likely to increase further as mineral ore grades diminish, demand increases and resources deplete over time. Demand for the minerals used in EV battery manufacture (Lithium, Cobalt, Nickel) is projected to increase in the coming decades by anywhere between 400% and 4,000%, depending on the “success” of the transition. There is simply not enough mining activity or capability in the world to make enough batteries for the promised global transition to Electric Vehicles at any significant proportion of the total vehicle population.

As the demand for these minerals escalates dramatically as the transition to renewables evolves, so too will the costs of these finite raw materials increase to render the already expensive battery technology increasingly uncompetitive, far more than offsetting any marginal efficiency gains that this technology has made in the last decade.

Batteries also don’t generate energy themselves, contrary to public perception, but only store it on a very limited basis. If Tesla’s Gigafactory (the largest Lithium battery factory in the world) maintained its present day production for 1000 years, it would only make enough batteries for about two days’ worth of the current U.S. electricity demand.
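The Gigafactory claim is an order-of-magnitude calculation that can be checked with two assumed inputs, neither of which comes from the text: an annual battery output of roughly 35 GWh per year for the factory, and US electricity consumption of roughly 4,000 TWh per year:

```python
# Order-of-magnitude check of the "1,000 years for two days" claim.
# Both inputs are assumptions: Gigafactory output ~35 GWh/year of
# battery storage, US electricity consumption ~4,000 TWh/year.
GIGAFACTORY_GWH_PER_YEAR = 35.0
US_DEMAND_TWH_PER_YEAR = 4000.0

us_demand_gwh_per_day = US_DEMAND_TWH_PER_YEAR * 1000 / 365   # ~11,000 GWh/day

# Years of factory output needed to store two days of US demand.
years_for_two_days = 2 * us_demand_gwh_per_day / GIGAFACTORY_GWH_PER_YEAR
# Roughly 600+ years under these assumptions: the same order of
# magnitude as the "two days per 1,000 years" figure in the text.
```

The exact ratio depends on the assumed factory output, but any plausible figure yields centuries, which is the substance of the claim.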

Furthermore, when it comes to energy storage, it costs less than $1 a barrel to store oil or natural gas (in oil-energy equivalent terms) for a couple of months. Storing coal is even cheaper than that. Meanwhile, with batteries, it costs roughly $200 to store the energy equivalent to one barrel of oil.

China dominates global battery production, with a grid that is 70% coal-fuelled, so Electric Vehicles (EVs) using Chinese batteries will create more carbon dioxide emissions than they save by replacing oil-burning engines. With EVs, you don’t eliminate emissions: you export them to the manufacturer, and transfer them to the grid generators in daily use. Across its life span, an EV is likely to produce a net increase in CO2 emissions compared to its internal combustion engine counterpart until around 90,000 km of driving, by which time its 450 kg (up to a maximum of 900 kg) lithium battery is due for replacement.
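The break-even logic behind the ~90,000 km figure can be sketched with a simple model. Every parameter below is an assumption chosen to reproduce that crossover, not a figure from the text; real values vary widely with grid mix, battery size and vehicle class:

```python
# Illustrative break-even model. All parameters are assumptions chosen
# to reproduce the ~90,000 km crossover discussed in the text.
EV_MANUFACTURING_PENALTY_KG = 9000   # extra CO2 (kg) to build EV + battery
ICE_G_PER_KM = 180                   # petrol car, tailpipe + fuel chain
EV_G_PER_KM = 80                     # charging on a largely coal-fired grid

def breakeven_km(penalty_kg: float, ice_g_km: float, ev_g_km: float) -> float:
    """Distance at which the EV's per-km saving repays its build penalty."""
    saving_g_per_km = ice_g_km - ev_g_km
    return penalty_kg * 1000 / saving_g_per_km

distance = breakeven_km(EV_MANUFACTURING_PENALTY_KG, ICE_G_PER_KM, EV_G_PER_KM)
# ~90,000 km under these assumed inputs; a cleaner grid or a smaller
# battery shortens the distance, a dirtier grid lengthens it.
```

The model makes the structural point plain: the break-even distance is just the manufacturing penalty divided by the per-kilometre saving, so it is extremely sensitive to where the electricity comes from.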

Wind and Solar: What a Waste:

A further claim has been made by renewables advocates: “… that solar panels are over 95% recyclable, and renewable energy is actually driving electricity prices downwards.”

In the real world, however, the opposite of this has been universally the case, where the vast majority of solar panels have not been, and will likely never be recycled, due to the economic realities and engineering limitations involved. Similarly, renewable energy has universally driven power prices up, not down, in direct proportion to the penetration of renewables in a nation’s energy mix.

By 2050, it is predicted that globally we will have to dispose of more than 2 million tons of wind turbines and 6 million tons of solar panels every year.

One reason the world may be throwing away so much not-so-renewable waste is that recycling it costs ten times as much as the value of what can be recovered from the recycling process.

Who would have thought that collecting low density energy in extreme environments would create megatons of tough, non-biodegradable infrastructure, embedded with toxic heavy metals?

This is the hidden cost of our mad dash for wind power: thousands upon thousands of decommissioned blades that are so difficult to recycle that they are simply dumped as landfill, giant carcasses that future archaeologists can excavate while pondering the intellectual limitations of those responsible.

From the UK Daily Mail:

"Scientists at America’s National Renewable Energy Laboratory have warned that in the next few decades, the world faces a ‘tidal wave’ of redundant blades that will number ‘hundreds of thousands, if not more’.

By 2050, it’s predicted that the world will need to dispose of two million tons of wind turbine blade waste every year. In the UK, the volume already exceeds 100,000 tons per year.”

The Problem of Scale, the Achilles Heel of Renewable Energy Schemes:

From Roger Pielke:

In 2018 the world consumed 11,743 mtoe (million tonnes of oil equivalent) in the form of coal, natural gas and petroleum. The combustion of these fossil fuels resulted in 33.7 billion tonnes of carbon dioxide emissions. For those emissions to reach “net-zero”, therefore, we will have to totally replace the ~12,000 mtoe of energy consumption expected for 2019.

To achieve net zero carbon emissions by 2050, we would need to build the equivalent of one large nuclear power plant, or some 1,500 wind turbines, every day, just to meet present demand.

There are roughly 11,000 days left until January 1, 2050, so that is a minimum of 16,500,000 wind turbines between now and then. And given their 20-year lifespan, the majority of those will require decommissioning and replacement in the meantime, so the real number is essentially double that.

When you realise that each wind turbine costs USD $500,000 to decommission, that is USD $8.25 trillion just to decommission the wind turbine generators to be built between now and 2050, and that doesn’t take into account the cost of decommissioning all the fossil fuel generators that will have to be retired as well.

Add in the cost of building large utility-scale wind turbines (which ranges from about $1.3 million to $2.2 million per MW of nameplate capacity installed), multiplied across 33 million turbines (16.5 million x 2), then add battery storage like South Australia’s, plus schemes such as Snowy 2.0 for storage to partially overcome intermittency, plus backup generation from gas and diesel for when these generators don’t deliver, and you are looking at hundreds of trillions of dollars, against a global GDP of roughly $90 trillion.

It is the walking definition of a pipe dream.
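The build-out arithmetic above can be checked directly. The only number not taken from the text is the assumed 2.5 MW turbine size used for the capital-cost range, which is my assumption:

```python
# Sanity-check of the build-out arithmetic quoted above.
# The 16.5 million figure implies 1,500 turbines per day over ~11,000 days.
days_to_2050 = 11_000
turbines_per_day = 1_500
turbines_needed = days_to_2050 * turbines_per_day        # 16,500,000

decommission_cost_usd = 500_000                          # per turbine, from the text
total_decommission_usd = turbines_needed * decommission_cost_usd
print(f"Decommissioning: ${total_decommission_usd / 1e12:.2f} trillion")  # $8.25T

# Capital cost, using the quoted $1.3M-$2.2M per MW of nameplate capacity
# and an ASSUMED 2.5 MW turbine (turbine size is not stated in the text).
turbine_mw = 2.5
fleet = turbines_needed * 2      # x2 to cover 20-year replacement, per the text
capex_low_usd = fleet * turbine_mw * 1.3e6
capex_high_usd = fleet * turbine_mw * 2.2e6
print(f"Build cost: ${capex_low_usd / 1e12:.0f}-{capex_high_usd / 1e12:.0f} trillion")
```

Under those assumptions the capital cost alone lands in the low hundreds of trillions, which is the order of magnitude the text is pointing at.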

From Mark P. Mills, a senior fellow at the Manhattan Institute and a McCormick School of Engineering Faculty Fellow at Northwestern University:

“1. Hydrocarbons supply over 80% of world energy: If all that were in the form of oil, the barrels would line up from Washington, D.C., to Los Angeles, and that entire line would grow by the height of the Washington Monument every week.

2. The small two-percentage-point decline in the hydrocarbon share of world energy use over the past two decades entailed over $2 trillion in cumulative global spending on alternatives (exclusive of subsidies paid); Solar and Wind today supply less than 2% of global energy.

3. A 100x growth in the number of electric vehicles to 400 million on the roads by 2040 would displace 5% of global oil demand.

4. Renewable energy would have to expand 90-fold to replace global hydrocarbons in two decades. It took a half-century for global petroleum production to expand 10-fold.

5. Replacing U.S. hydrocarbon-based electric generation over the next 30 years would require a construction program building out the grid at a rate 14-fold greater than any time in history.

6. Eliminating hydrocarbons to make U.S. electricity (impossible in the near term, not remotely feasible for decades) would leave untouched 70% of U.S. hydrocarbon use—the U.S. uses 16% of world energy.

7. Efficiency increases energy demand by making products & services cheaper: since 1990, global energy efficiency improved 33%, the economy grew 80% and global energy use is up 40%.

8. Efficiency increases energy demand: Since 1995, aviation fuel use/passenger-mile is down 70%, air traffic rose more than 10-fold, and global aviation fuel use rose over 50%.

9. Efficiency increases energy demand: since 1995, energy used per byte is down about 10,000-fold, but global data traffic rose about a million-fold; global electricity used for computing soared.

10. Since 1995, total world energy use rose by 50%, an amount equal to adding two entire United States’ worth of demand.

11. For security and reliability, an average of two months of national demand for hydrocarbons is in storage at any time. Today, barely two hours of U.S. national electricity demand can be stored in all utility-scale batteries plus all the batteries in one million electric cars in America.

12. Batteries produced annually by the Tesla Gigafactory (world’s biggest battery factory) can store three minutes worth of annual U.S. electric demand.

13. To make enough batteries to store two days’ worth of U.S. electricity demand would require 1,000 years of production by the Tesla Gigafactory (world’s biggest battery factory).

14. Every $1 billion in aircraft produced leads to some $5 billion in aviation fuel consumed over two decades to operate them. Global spending on new jets is more than $50 billion a year—and rising.

15. Every $1 billion spent on datacenters leads to $7 billion in electricity consumed over two decades. Global spending on datacenters is more than $100 billion a year—and rising.”
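Two of the arithmetic claims in the list above can be checked directly: the efficiency-rebound figures in point 7, and the battery-storage figures in points 12 and 13. A minimal sketch (interpreting "efficiency improved 33%" as 33% more output per unit of energy, which is my assumption):

```python
# Point 7: efficiency vs total demand (the rebound effect).
# ASSUMPTION: "efficiency improved 33%" means 33% more economic output
# per unit of energy consumed.
gdp_growth = 1.80              # economy grew 80% since 1990 (from the text)
efficiency_gain = 1.33         # 33% more output per unit of energy (from the text)
implied_energy_growth = gdp_growth / efficiency_gain
print(f"Implied energy use: +{(implied_energy_growth - 1) * 100:.0f}%")
# ~ +35%, broadly consistent with the quoted ~40% rise: efficiency gains
# were simply outpaced by economic growth.

# Points 12-13: Gigafactory output vs US electricity demand.
minutes_stored_per_year = 3            # point 12: one year's output = 3 minutes
minutes_in_two_days = 2 * 24 * 60      # 2,880 minutes
years_needed = minutes_in_two_days / minutes_stored_per_year
print(f"Years of production for two days' storage: {years_needed:.0f}")
# 960 years, i.e. the "1,000 years" of point 13.
```

Both quoted figures are internally consistent under these readings.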

The Australian Experience:

In spite of being constantly pilloried by activists for its lack of “action” on “Climate Change”, Australia has invested more in renewable energy per capita than any other nation, failing to learn from the experience of other Western nations that have gone down this same road with remarkably consistent failure. Between 2017 and 2020, Australia deployed 12x more Wind and Solar per capita than any other nation, and by 2020 it had the world’s highest per capita deployment of solar panels and was in the global top 10 for Wind farm installation.

This has been a function of deliberate Australian Federal and State Government policy, enacted and perpetuated via the National Electricity Market (NEM), established in 1998, and the Renewable Energy Target from 2007 onwards. As a consequence, consumer electricity prices in Australia, which had fallen significantly over the preceding 40 years (indexed to CPI), doubled in a little over a decade after 2007, and have increased by a further 50% in recent months, since the end of the restrictions enacted in response to the COVID-19 pandemic, with the war in Ukraine as the excuse du jour.

The Australian Mandatory Renewable Energy Target (MRET), as the name suggests, MANDATES that power providers source a specified percentage of their electricity from renewables, even when other, cheaper sources are available. In addition, at the time of writing, those same renewable energy sources have priority access to the market under this scheme.

As the government’s energy regulator boasts on its own website, the scheme (i.e. the MRET) “creates demand for electricity from renewable sources”. A truly free market doesn’t have to “create demand”; market forces, unencumbered by artificial restrictions imposed by governments, generate it on their own. If you distort the market so far that it becomes uneconomic to maintain a coal fired power station properly, and you also force it to run inefficiently on demand, at the beck and call of intermittent renewable generation, then of course it will run at reduced capacity.

According to AEMO (Australian Energy Market Operator) statistics, Wind power generation nationally has at times run as low as 3% of nameplate capacity. The question must then be asked: why is that 3% considered acceptable, yet 70% of nameplate capacity for often poorly maintained and economically hobbled coal fired generators is considered a failure?
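Capacity factor is simply the energy actually delivered divided by what the nameplate rating could have delivered over the same period. A minimal sketch with hypothetical numbers (AEMO publishes the real generation data):

```python
# Capacity factor = energy actually delivered / (nameplate power x hours).
# The 100 MW farm and 72 MWh figures below are hypothetical, chosen to
# reproduce the 3% trough the text cites.

def capacity_factor(energy_mwh: float, nameplate_mw: float, hours: float) -> float:
    """Fraction of nameplate output actually delivered over the period."""
    return energy_mwh / (nameplate_mw * hours)

# A hypothetical 100 MW wind farm delivering 72 MWh over a still 24-hour day:
cf = capacity_factor(energy_mwh=72.0, nameplate_mw=100.0, hours=24.0)
print(f"{cf:.0%}")   # 3%
```

The same formula applied to a coal unit at 70% of nameplate gives 0.70, which is the comparison the question above turns on.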

With coal fired power stations made uneconomic through the MRET and its mandated market distortions, no investor in their right mind would back coal no matter how good the economic environment, knowing that a change of government at the State or federal level could shift the playing field and ruin even the best business case on a whim.

Meanwhile, Australia holds some of the world’s largest reserves of both Uranium and Thorium, which could easily underpin and promote a “nuclear revolution” moving forward; yet nuclear power generation is well over 10 years away at a minimum, and would require a conversation with the voting public (and a change in the draconian laws of the land) that no one in the political space has the courage to have.

Add to that the case of the “Snowy 2.0” pumped hydro scheme, touted as “a win-win project that will assist Australia in meeting its global commitments on Climate Change. It is good for renewables, will generate more power, lower electricity prices, and provide jobs and opportunities in regional Australia.”

This is a massive, multibillion dollar expansion of the existing Snowy Mountains Scheme, and a highly complex engineering project, originally mooted to cost $AUD5 billion; even at these early stages, costs are blowing out, and it is likely to cost 5-10x that much once completed. Then again, when one is saving the planet from Climate Armageddon, what is a $billion or twenty here and there between friends?

The project involves linking two existing dams, Tantangara and Talbingo, through 27km of tunnels and building a new underground power station. “Water will then be pumped to the upper dam when there is surplus renewable energy production and the demand for energy is low, and then released back to the lower dam to generate energy when electricity demand is high.”

In other words, Snowy 2.0 is a fancy arbitrage system: buy power when it is cheap, generate it back later at a higher price, with the consumer paying the cost and the investors reaping the dividend.

This scheme is alleged to provide “flexible, on-demand power while reusing or ‘recycling’ the water in a closed loop and maximise the efficiency of renewables” by using “excess” solar and wind energy to pump water to the higher dam, to be stored for later use. Of course, that firstly presupposes that there will ever be an excess of power over requirements from the grid once coal fired and other hydrocarbon power stations are removed, and “renewable” power is the only generating option.

The flaws in the scheme are obvious to all but government experts.

About 40% of the energy generated is lost before it reaches consumers, a greater loss than in other pumped storage schemes (the distance between its reservoirs is far longer) and greater than in other storage options. So far from “generating power” (a bald-faced lie), it wastes significantly more energy pumping water up the hill than it generates on the way down.
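The economics that loss implies can be sketched directly: with a 40% round-trip loss, every MWh sold requires roughly 1.67 MWh bought, so the sale price must beat the purchase price by that ratio before transmission, capital and operating costs are even counted. The $60/MWh buy price below is purely illustrative:

```python
ROUND_TRIP_EFFICIENCY = 0.60   # 40% lost between pumping and regeneration, per the text

def break_even_sell_price(buy_price_per_mwh: float) -> float:
    """Minimum sale price per MWh that recovers the pumping energy cost alone
    (ignores transmission losses, capital and operating costs)."""
    return buy_price_per_mwh / ROUND_TRIP_EFFICIENCY

# Illustrative: pumping energy bought at $60/MWh
print(break_even_sell_price(60.0))   # must sell above ~$100/MWh to break even
```

This is the arbitrage mechanism described above, expressed as a price ratio rather than an energy balance.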

It would also require substantial transmission works to connect Snowy 2.0 to the grid, costing $billions more than its advertised budget.

It will lead to more than 50 million tonnes of carbon dioxide emissions during the construction phase and the first 10 years of operation, despite emissions reduction being the alleged reason for its construction in the first place.

From an environmental perspective, it will convert extensive areas of national park into a construction site, with permanent damage over thousands of hectares and the destruction of habitat used by 14 threatened species.

During times of drought, when dam levels are low, it cannot be used as advertised because dam levels won’t allow it, whilst in times of flood, it will have to be limited by the dam levels at the bottom as any outflows or overflows would endanger already swollen river systems and low lying areas surrounding these two dams.

Its cycling energy storage capacity of 350 GWh (7 days of continuous generation) is wildly overstated. If, for example, Tantangara is half full, “only 175 GWh (at most) could possibly be delivered, and when it is at minimum operating level Snowy 2.0 couldn’t generate at all“.

If Snowy 2.0 were ever called upon to generate continuously for 7 days till Tantangara was emptied, it would likely be at a time of major problems in the National Electricity Market. At such times Snowy 2.0 is likely to have been called upon to generate in the days beforehand, thereby reducing water levels in Tantangara, and therefore the amount of on demand energy available. It would be most unlikely for 7 days of continuous generation to follow a period of absolutely no generation at all.

Once this Snowy 2.0 scheme has been fully discharged, it will have to draw on large quantities of power, possibly at a time of energy shortfalls, to pump water uphill again (wasting 40% in the process) before its storage capacity is restored for future requirements.
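Putting the quoted numbers together (350 GWh capacity and the 40% round-trip loss are from the text; the 2,000 MW generating capacity is my assumption, back-calculated from the quoted "7 days"):

```python
storage_gwh = 350.0           # quoted cycling storage capacity
power_gw = 2.0                # ASSUMED nameplate generating capacity (not in the text)
round_trip_efficiency = 0.60  # 40% loss, per the text

hours_at_full_output = storage_gwh / power_gw
print(f"{hours_at_full_output / 24:.1f} days at full output")    # ~7.3 days

# A half-full upper reservoir halves the deliverable energy, as quoted:
print(f"{storage_gwh / 2:.0f} GWh deliverable when half full")   # 175 GWh

# Energy drawn from the grid to refill after a full discharge:
refill_gwh = storage_gwh / round_trip_efficiency
print(f"~{refill_gwh:.0f} GWh of grid energy needed to recharge")
```

The refill figure (~583 GWh for 350 GWh delivered) is the concrete form of the point above: the scheme must buy back substantially more energy than it ever dispatches.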

The original Snowy 1.0 scheme and Tantangara are operated such that “ponds are low entering the winter period in order to address the risk of 1-in-10-year extreme inflows”. If this practice is maintained, then Snowy 2.0 will never be ‘fully charged’, at least in winter and early spring.

Wind and Solar, even with various forms of “battery” storage investment, will remain as unreliable, irredeemably intermittent and unfit for purpose as they always have been. We in Australia have thus tied ourselves in a Gordian Knot where the only options are rolling power outages, energy rationing and ever increasing power prices. Those prices will push the most marginal, and even some not-so-marginal, households into energy poverty; they will disproportionately impact main street small businesses relative to their global corporation competitors; and they will drive manufacturing businesses and various energy intensive industries either to close down or to move overseas to nations where cheaper energy can be obtained.

Those consequences are ENTIRELY down to renewables advocates, who generally have no comprehensive understanding of what they are doing, and no conscience about the mess of the electricity system they are making, nor do they care for the innocent victims they are going out of their way to create.

The Texas Winter 2021 Grid Failure- A Case Study in the Impracticality of Wind and Solar at Scale:

In February 2021, the state of Texas suffered a major power crisis during three severe winter storms that swept across the United States on February 10–11, 13–17, and 15–20, depending on the region.

The storms triggered the worst energy infrastructure failure in Texas state history, leading to shortages of water, food, and heat. More than 4.5 million homes and businesses were left without power, some for several days, in blizzard-like conditions. At least 246 people were killed directly or indirectly, with some estimates as high as 700+ killed as a result of the crisis. 

When these blizzard conditions hit Texas and demand rose to provide extra heating, Wind generation went from supplying up to 40+% of demand to less than 8%, and often far less than that. Gas supply had to ramp up rapidly to meet that demand, and very nearly managed to do so.

Yet according to some, this is not a savage indictment of renewable energy generators that go missing right when they are needed most.  

The root cause of this grid failure crisis is that the state of Texas bought lock, stock and barrel into the Anthropogenic Global Warming/Renewable Energy ideology.

They believed that scaling up their wind generation to the 5th largest in the world would ensure their supply was secure for energy independence into the future, regardless of the circumstances – They were wrong.

They believed that, with all that nameplate capacity on tap, they had no need for an inter-connector to other states, and that they could simultaneously decommission many of their coal fired power plants, because capacity factors for Wind and Solar are supposedly just as reliable as those for Hydrocarbon (fossil fuel) generation – They were wrong.

They believed that they didn’t need de-icing equipment for their wind turbines because in a “warming world” they’d never see snow again – They were wrong.

Whether the government is Republican or Democrat apparently matters very little. Once you have drunk the renewable energy Kool Aid, mistakes like these just seem to follow as surely as night follows day.

The central, and undeniable fact remains that the state’s Wind power generation went missing right at the time when it was most needed. Gas was ramped up significantly to meet demand, but couldn’t quite compensate for Wind generation dropping off the perch. It was, and remains a savage indictment of the effectiveness, or lack thereof, of Wind power generation in cold spells like this. 

This crisis serves to validate what those sceptical of renewable energy dogma have stated all along: that Wind and Solar generation require backup Hydrocarbon (fossil fuel) capacity nearly equal to the capacity they nominally provide, in case they fail to deliver at any given point in time.

How do we therefore propose that a modern, technological society, dependent on reliable 24/7 delivery of its energy needs, can run on the whims of weather dependent, intermittent power generation? 

The War on the Working Man Goes Regional- Net Zero in Action:

Nothing quite highlights the futility, and the danger, of the path we have set ourselves down like the effect of Net Zero policies on the Agricultural sector. The regional farms that provide the majority of the food and sustenance on which Western nations survive and thrive are under siege from Climate Change warriors, who view the business of feeding and clothing us as a planet destroying activity that must be abruptly, and ideally completely, curtailed.

The 2022 Dutch Farmers’ protests highlight the dangerous route such policies are traversing, bearing in mind that the Netherlands is one of the most agriculturally intensive, farming-centric economies in the world. These proposals by the Rutte government to slash nitrogen emissions come in the context of the Russian invasion of Ukraine, a conflict between two economies that produce much of the world’s wheat, which is threatening food security in Europe and beyond while simultaneously reducing the availability of many of the raw materials (and the natural gas for the Haber-Bosch process) that make up most commercial fertilisers: nitrogen, phosphorus and potassium. It is a perfect storm that threatens the food supplies of millions of people, with a whole continent facing widespread hunger if governments persist in their zealous pursuit of “carbon reductions”.

From the BBC:

“Dutch government proposals for tackling nitrogen emissions indicate a radical cut in livestock – they estimate 11,200 farms will have to close and another 17,600 farmers will have to significantly reduce their livestock.

Other proposals include a reduction in intensive farming and the conversion to sustainable “green farms”. As such, the relocation or buyout of farmers is almost inevitable, but forced buyouts are a scenario many hope to avoid.

The cabinet has allocated €25bn (£20bn) to slicing nitrogen emissions within the farming industry by 2030, and the targets for specific areas and provinces have been laid out in a colour-coded map.” How quaint, and colour co-ordinated of them.

The Dutch farmers are by no means alone in having to battle faceless bureaucracies sacrificing their aspirations on the altar of emission reductions and other green schemes.

Not coincidentally, and almost simultaneously, farmers from across Germany descended on Berlin in June 2022 to take part in a very similar mass protest against the government’s agricultural reform package, which foresees a reduction in the use of fertilizers, pesticides and insecticides with a view to curbing insect die-off and reducing the nitrate content of groundwater.

In August 2022, farmers in northern Belgium attempted to break into the town hall of Hoogstraten after staging a mass protest in response to their government’s similar EU-driven agenda to cut nitrogen emissions.

The European Union’s Natura 2000 network, which mandates that the 27 member states must protect designated habitats in a “sustainable manner, both ecologically and economically,” and therefore must cut nitrogen emissions, is the driving force behind this agenda to curtail emissions in the agricultural sector.

From mid-2021, farmers in Sri Lanka had been protesting vehemently against the former Rajapakse government’s sudden decision to reduce import costs by banning chemical fertilisers, pesticides and weedicides under the guise of transforming Sri Lankan agriculture into an organic producer.

As a consequence, this once self-sufficient nation reels from the fall-out of this ill-conceived shift to organic agriculture, compounded by fuel shortages. This led, not surprisingly, to Sri Lanka’s worst-ever economic crisis, and before long the nation was facing widespread shortages of domestically grown and produced food, in addition to a shortfall in food imports.

In April 2022, the farmers joined the mass protests and demonstrations of other workers and the marginalised poor across the country demanding the resignation of Gotabhaya Rajapakse and his government. While these struggles forced Rajapakse to flee the country and resign as president, the mass movement was then betrayed by the trade unions, who were backed in this betrayal by Left wing activist groups. 

The Wickremesinghe government, which replaced Rajapakse, then ruthlessly implemented International Monetary Fund austerity measures and this has seen a worsening of the plight of workers, farmers and the rural poor.

Finally, in October 2022, New Zealand’s Prime Minister Jacinda Ardern introduced a world-first tax on methane emissions from livestock, a so-called “burp tax” which is likely to devastate primary industry in that agriculturally intensive economy. Widespread protests by farmers are once again likely to fall on deaf ears.

If the mania to offset “carbon” and other emissions is carried through to its logical net zero conclusion, farmers around the world will in future be paid NOT to produce crops and meat, as they are carbon and nitrogen taxed out of existence for the sin of their “emissions”.

So not only will we see, in the case of Australia for example, a nation with abundant mining and mineral resources unable to exploit them, but we will also see a nation with abundant agricultural land and productive capability that will not be able to feed itself. Genius.

Conclusion:

Both wind and solar technologies suffer from an irredeemable intermittency that makes them intrinsically unsuitable for the provision of 24/7 base load power. They also introduce unwanted and dangerous instability into the power grid, and they are largely redundant given the need for near total backup capacity from the fossil fuels they are meant to replace. So, as those sceptical of the renewable energy mantra have rightly pointed out, the true cost of renewables is “what you give up to get it”.

These trade-offs are indeed too great with Wind and Solar “renewables”, because the cost of their extensive provision causes energy poverty for a significant proportion of the population, and makes energy intensive manufacturing and industries uncompetitive, forcing them either to fold or to move their operations (and the associated employment opportunities and tax receipts) offshore.

Well over a billion people in various parts of the developing world have no access to power or running water. Windmills, solar panels and expensive batteries will not lift them out of that poverty. Ever.

Climate change activists and renewable energy boosters advocating “net zero” policies have no real concept of the crushing poverty that much of the world’s population struggles to overcome, nor that increasing one’s carbon footprint is exactly how that crushing poverty is overcome. They also have no concept of how much energy the world already uses, or of how much more will be required for the world’s population to modernise their societies in the 21st century. Globally, only 1 in 20 people have a motor vehicle, whilst more than 80% of the world’s population have never taken a single airline flight.

Meanwhile, global climate related deaths are at an all-time low, having declined precipitously since the middle of the 20th Century. From a climate perspective, there has never been a better time to live in comfort and safety than right now.

A tax on energy is a tax on the working class. It affects them disproportionately, as it is intended to do. More people have been lifted out of poverty and fed and housed by hydrocarbon “fossil fuels”, and those products derived from hydrocarbons either directly or indirectly, than by any other single factor.

The coal gas storage domes that formerly were dotted around major cities in the industrialised world, were largely responsible for building the civilisation we now enjoy. Now, they have been rightly phased out, all in good time, as better technologies have come along to replace them, but it doesn’t diminish the important role they played in improving the lives of millions and millions of people since the industrial revolution.

Wind turbines, on the other hand, will no doubt become the modern equivalent of the Easter Island Moai statues, a lonely collection of totems standing in memoriam to a lost civilisation. These totems will tear down what has taken generations to build on the false promise of modernity, in a blind alley of technology not fit for purpose. The “brighter, better future” that was promised by the renewable energy transition, could very well strangely resemble that endured by the peasant classes prior to the industrial revolution.

The facts remain that government edict across the Western world, and the suite of policies they have mandated, have rendered what was once economically viable and sustainable suddenly uneconomic. It was entirely deliberate, ideologically driven sabotage. The alternatives are not fit for purpose, and so we are about to reap what has been deliberately sown.

Whilst private companies do respond to government incentives, it is absolute poison to an economy to introduce perverse incentives that advantage companies and people who are gaming the system, and the renewable sector is rife with these perverse incentives. It is a carpet bagger’s dream, where speculative practices without due diligence attract subsidies regardless of the likely outcome producing benefits to consumers. 

Free markets are the best way to organise economic activity. The MRET, and schemes like it elsewhere, rig the market to disincentivise efficient fuel sources in favour of more inefficient and costly ones; that is the exact opposite of a market. It is a top down command economy dressed up to resemble one. The global catalogue of failure of such central planning schemes is a litany of disaster, to be avoided like the plague by any sane and rational nation.

What Western governments seem to have forgotten in their zeal for a bright, shiny renewables energy transition future, is that mankind needs low cost and high reliability energy to survive, to prosper and for the sake of sociocultural, economic and geopolitical security.

Societies that are energy impoverished cannot maintain social order, cannot defend themselves from foreign invasion or from undue foreign or malign influence, cannot maintain adequate sanitation and protection of the environment, and cannot maintain sustenance, safety and security for their civilian populations to live at anything other than a hand to mouth existence, and one that is often brutish and short.

Only a tiny elite sector of society is effectively insulated against the full brunt of the negative consequences that will likely result from these expensive and inefficient “renewable” technologies replacing inexpensive and efficient ones like coal that have served our economy well for the last century or more.

Technology will eventually advance in spite of this current folly, and this will hopefully consign Wind and Solar to the backwater of scientific progress as a curious example of unbridled faith and mendacity trumping established engineering principles and due diligence that would once have snuffed the misguided fad out at its inception.

A House of Cards, Six No Trumps and One-Eyed Jacks Are Wild:

“House of Cards” TV Series Logo, with US Capitol Backdrop (Source: Wikipedia)

Donald J Trump. A brash, larger than life real estate mogul and billionaire New York businessman. A one-time presenter of the popular reality TV show “The Apprentice”, his inimitable sense of grandiose style and bombast was often the target of derision and hatred, and also the object of admiration and envy, in reasonably equal measure.

Prior to his running as a serious Presidential candidate in the 2016 Presidential Election, it is fair to say that Donald Trump vacillated across the US political spectrum, from Republican to Democrat, from Libertarian to Independent to Reformist, and during the George W. Bush era his alignment with the Democrats led him, ironically, into the orbit of, and an erstwhile friendship with, former President Bill Clinton and his wife Hillary Rodham Clinton.

Very few people, in the early stages of Trump’s candidacy for the Republican Presidential nomination, envisaged that he would soon be elected the 45th President of the United States. Having predicted his victory even in those very early stages to any who would listen, and in spite of my personal misgivings about his character, I believe I may be one of the few people able to be remotely objective in assessing all that has followed from his election. I view him neither as the Messiah, come down from on high to save the U.S. and its Constitution from oblivion, nor as the second coming of Adolf Hitler, such has been the polarised nature of political opinion as to his virtues or deficiencies.

Never, in my view, has so polarising a figure attained so high a public office in the U.S., nor has any such figure prompted anything approaching the level of unreasoning hatred and delusional belief as Donald J Trump. The so-called “Trump Derangement Syndrome” has become so omnipresent amongst the overwhelming majority of the “Liberal” Left that this derangement is now an almost essential component of one’s belief system if one is to be considered acceptably “progressive” in American politics; failure to hold each and every accusation levelled against President Trump as sacrosanct is to consign oneself to the dark realm of the conspiracy theorist, or worse, to out oneself as a potential “Neo-Nazi” or “domestic terrorist”.

The term “conspiracy theorist”, in broad terms, serves two distinct purposes. Firstly, it allows any former or would be conspirators to go about their plots and schemes unchecked and unscrutinised, while it simultaneously allows the intellectually incurious to maintain their current level of delusion that all is right and proper with the world, completely untethered to reality. Therefore, it is hardly surprising that almost no one on the political Left in particular is prepared to look at all that has happened since 2016 with any attempt at dispassionate or objective analysis.

Therefore, at the risk of being labelled a conspiracy theorist, I intend to wear the epithet as a badge of honour and go to the heart of all the wrongdoings and terrible precedents that have been established, all on the pretext of neutralising, or otherwise bringing down the erstwhile (and possibly future) President Donald Trump.

“Crossfire Hurricane”: The Revolving Doors of Never-Ending Inquiries, Obstruction Traps and Impeachments:

When Donald Trump was elected in 2016 as the 45th President of the United States, his outsider status, and the fact that he was a billionaire mogul beyond the control of the “House of Cards” known as the Washington establishment, made him an existential threat to both the Republican and Democrat establishment (i.e. the “Uniparty”, who in spite of being in nominal “opposition” to each other, actually share many common interests and outlooks, and are more than happy to work co-operatively toward their shared statist goals, where the public interest is often a secondary, at best, consideration).

I must say, from the outset, that Donald Trump is a person whose undoubted narcissism and sundry other personality flaws I largely find distasteful on a personal level. However, my sense of fair play and justice would dictate that, if I felt that he was falsely accused of a crime, I would be staunch in his defence. If I thought he was the victim of a witch hunt, I would be honour bound to say so, despite any misgivings I might have regarding his personality.

After all that has occurred since 2016, I refuse to value my personal likes or dislikes above the principles being abused in the service of taking President Trump down. One need look no further for such abuse of principle than the aptly named surveillance operation “Crossfire Hurricane”. In the modern idiom at least, the term derives from the lyrics of the Rolling Stones song “Jumpin’ Jack Flash”, and refers to Keith Richards having been born in the midst of a German air raid. More broadly, the phrase suggests an attack of any kind coming from all directions, where the person at the eye of the figurative “storm” cannot see where the next blow is coming from. In meteorological terms, as the eye of a hurricane passes over a fixed spot, the wind direction changes 180 degrees and comes from the completely opposite direction.

With that in mind, I believe that it is obvious that the name chosen by those intelligence operatives in the Federal Bureau of Investigation (FBI) was indicative of a well-laid plan they had put in place to hamstring then President-elect Trump, to attack him from every direction at once so that he could do little to defend himself from the onslaught, and to thereby stymie his effectiveness in his role as U.S Commander-in Chief, and more importantly to sabotage the implementation of his political agenda.

So it was that Crossfire Hurricane became the code name given for the counterintelligence investigation undertaken by the FBI from July 31, 2016, to May 17, 2017, into the allegedly multiple links between Russian officials and associates of Donald Trump, and “whether individuals associated with his presidential campaign were coordinating, wittingly or unwittingly, with the Russian government’s efforts to interfere in the 2016 U.S. Election”.

Unfortunately for all concerned, there was no factual basis for the FBI’s “Crossfire Hurricane” investigation into then-candidate Trump. The FBI’s counterintelligence chief Peter Strzok opened the investigation into Trump’s campaign in July of 2016, based on false claims that his camp was working with the Russians, claims that rested largely on two elements. The first was the false claim that one-time Trump campaign advisor Carter Page was a Russian spy, when in fact he was merely a CIA source and asset providing intelligence on his Russian contacts, a salient fact deliberately hidden from the Foreign Intelligence Surveillance Court (FISC).

Carter Page: When a CIA/FBI Asset becomes a Russian Agent:

“The Foreign Intelligence Surveillance Court (FISC) slammed the FBI on Tuesday in a rare public statement over the agency’s handling of former Trump campaign aide Carter Page’s warrant application and subsequent renewals”, according to an article in the Wall Street Journal.

“In order to appreciate the seriousness of that misconduct and its implications, it is useful to understand certain procedural and substantive requirements that apply to the government’s conduct of electronic surveillance for foreign intelligence purposes,” reads the statement. 

The FBI’s handling of the Carter Page applications, as portrayed in the Office of the Inspector General report, was “antithetical to the heightened duty of candour” required of federal investigators, wrote the court, adding that “the frequency with which representations made by FBI personnel turned out to be unsupported or contradicted by information in their possession, and with which they withheld information detrimental to their case, calls into question whether information contained in other FBI applications is reliable”. The court called the recent watchdog report from the DOJ’s Inspector General (Michael Horowitz) “troubling”.

Furthermore, then FBI Director James Comey told the FBI in an email chain in December of 2016 that CIA Director John Brennan had insisted on including the “Steele dossier” (see below) in the Intelligence Community Assessment, but the CIA claims that it was James Comey who made the recommendation and that it was Brennan and DNI chief James Clapper who objected. Like Tweedle Dee and Tweedle Dum, each points the finger of blame at the other, and inevitably (and conveniently) neither is held accountable for his actions!

Former FBI Attorney Kevin Clinesmith admitted to altering the Carter Page evidence to support the FISA warrant that was then used to spy on the Donald Trump campaign in 2016. Clinesmith altered an email from the CIA used to request a FISA warrant, and subsequent renewals, for surveillance of Trump campaign advisor Carter Page. Page had previously worked as a source for the CIA; however, Clinesmith falsely stated that Page was “never” a CIA source. Clinesmith received a minor slap on the wrist for his conduct.

Hidden in the Appendix to Inspector General Horowitz’s report into the FBI’s behaviour during the Russian interference investigations were a total of 51 identified violations of the Woods Procedures by then Director James Comey’s FBI. The Woods Procedures were designed to protect American citizens by “ensur[ing] accuracy with regard to … the facts supporting probable cause”, after recurring abuses in which the FBI had presented inaccurate information to the Foreign Intelligence Surveillance Court (FISC). In spite of their number, these violations were duly buried in the Appendix, kept out of the main body of the report, and so no accountability for the serial infractions was ever brought to bear.

The Steele Dossier: The Real Russian Disinformation Campaign:

The second element of the predicate for surveillance on the Trump campaign in 2016, an egregious abuse of power that would become known as “Russiagate” (or “Spygate” in some quarters), was the entirely fanciful and by now discredited Steele Dossier, an amalgam of tawdry innuendo and scuttlebutt dressed up as foreign intelligence gathering. It was initially funded by establishment elements of the Republican Party, but subsequently taken over and financed by the Hillary Clinton campaign and the Democratic National Committee, via an intermediary known as Fusion GPS. The allegations in the report were largely Russian disinformation, from dubious sources*, many of whom were either Russian-aligned or anonymous, presented to the FISC as factual intelligence gathering rather than the uncorroborated rumour and innuendo** it was, with the intent to facilitate the ongoing surveillance of the Trump campaign (in echoes of the Watergate scandal).

*The Inspector General Michael Horowitz’s report stated that “Steele himself was not the originating source of any of the factual information in his reporting.” Instead, the report found that Steele relied on a “Primary Sub-source”, later revealed as Igor Danchenko, who “used a network of further sub-sources to gather the information that was relayed to Steele”. Danchenko was recently charged with five counts of making false statements to the FBI on five different occasions (between March 2017 and November 2017) regarding the sources of material he provided for the Steele dossier, charges which he denies. Interestingly, Danchenko was on the payroll of the FBI as a paid confidential informant from March 2017 until October 2020, when this arrangement was abruptly terminated- read into that what you will, noting the coincidence of the termination date with the 2020 Presidential election.

** On 12 October 2022, comes the following: “During questioning from Special Counsel John Durham, Brian Auten, a supervisory counter intelligence analyst with the FBI, revealed the FBI offered Christopher Steele one million dollars if he could corroborate allegations in the Dossier, but that Steele could not do so. Auten repeatedly admitted under questioning from Durham that the FBI never got corroboration of the information in the Steele Dossier but still used it in the initial FISA application and in the three subsequent renewals.” (Source: Fox News)

The Alfa Bank- Trump Tower Connection (that wasn’t):

Just one example of the political jiggery-pokery undertaken to manufacture a false Russian connection to President-elect Donald Trump was the alleged link between computer servers in Trump Tower and a Russian bank, Alfa Bank, through which all manner of secret dealings were supposed to have gone on. As with most things related to the Russian collusion narrative, this was largely a smoke and mirrors exercise designed to promote the appearance of impropriety with little regard to any substantive evidence. The political gambit was one of flinging mud in the hope that some of it might stick. Of course, the compliant legacy media were only too happy to oblige, applying their usual lack of critical analysis to the allegations.

“The Hillary Clinton political machine pressured the FBI from a number of angles to ensure investigations (into a Russian Alfa-Bank/Trump Tower connection) were opened and reopened,” said one source who spoke on the condition of anonymity. “The deception was wide-ranging.”

Special Counsel John Durham outlined the FBI part of the scheme in a felony indictment of Clinton associate Michael Sussmann. The former Clinton campaign lawyer was charged with making a false statement to the former general counsel of the FBI when he claimed he was not working “for any client” in bringing to the FBI’s attention allegations of a secret channel of communication between computer servers in Trump Tower and Alfa Bank in Russia, allegations later shown to be without foundation.

In addition to the FBI, the 2016 Clinton campaign also tried to convince the Obama administration’s State Department, Justice Department and Central Intelligence Agency to look into the Alfa Bank hoax, and continued pressing the issue even after Trump was inaugurated in January 2017. Their goal was to trigger federal investigative activity targeting Clinton’s Republican rival, and to then leak this potentially damaging information to the media.

Clinton campaign foreign policy adviser (and later National Security Adviser) Jake Sullivan would later put out a written statement trumpeting the false Trump-Alfa Bank story, which was shared by then-candidate Hillary Clinton on Oct. 31, 2016, after Slate reported on it. Fusion GPS, the Washington opposition-research firm that worked as a paid agent of the Clinton campaign, and helped gather dirt on Alfa Bank and draft the materials that Michael Sussmann would later submit to the FBI, reportedly pressed Slate to publish the story.

Impeachments 1 & 2:

The calls to impeach President Trump began (and from several quarters) immediately after his election in 2016, even prior to his coming into office, such was the level of despair and unvarnished anger and hatred to be found amongst Democrat aligned and “civil rights” groups at his election.

Two civil rights groups were even trying “to boot President Donald Trump from the nation’s highest office” when they launched an online campaign to get the brand new commander-in-chief impeached on the very day of his inauguration, a campaign given ample publicity, and tacit approval, by the Washington Post and Time magazine. Their website, http://www.impeachdonaldtrumpnow.org, went live just as Trump was officially sworn in. It was run by two groups, Free Speech for People and RootsAction, who believed that President Trump’s possible conflicts of interest were sufficient grounds for his ouster, even at this early stage.

This trend would be replicated throughout Donald Trump’s presidency, where even the smallest perceived infraction or suspicion of impropriety, no matter how minor or inconsequential, was deemed an “abuse of power” or “conflict of interest” worthy of overturning the will of the people and a democratically elected President. These same groups would fall into deathly silence when similar concerns could just as easily have been raised after the 2020 Presidential election about the incoming President.

Without going into laborious detail over Impeachments 1 & 2, suffice it to say that the prime motivations for both were almost entirely political. Precedents scrupulously adhered to in the impeachment proceedings against President Richard Nixon and President Bill Clinton were not maintained, the standard of factual evidence the impeachment cases relied upon was sketchy at best, and due process and procedural fairness were ridden roughshod over in the furtherance of purely political goals.

In the first impeachment case, for example, an initially unnamed whistleblower was alleged to have overheard a phone conversation between President Donald Trump and Ukrainian President Volodymyr Zelenskyy, in which President Trump was accused of abusing his power as POTUS by asking the Ukrainian President to investigate alleged improprieties at Burisma, an energy exploration company based in Kyiv, and its association with both VP Joe Biden and his son Hunter. The whistleblower, subsequently claimed by some to be Lt. Col. Alexander Vindman, heard only President Trump’s side of the conversation, without context, and the Ukrainian President Zelenskyy himself denied that any undue coercion or pressure had been placed on him during the conversation.

The irony (and hypocrisy) of all of this is that Joe Biden, when he was Vice President in the Obama administration and in charge of its policies in Ukraine, had himself placed undue pressure (on camera, and in personal communication) on then Ukrainian President Petro Poroshenko to remove the Prosecutor General of Ukraine, Viktor Shokin, who was investigating alleged corruption at energy investment company Burisma, a company where Biden’s own son Hunter sat on the board of directors.

The second impeachment of President Trump, on January 13, 2021, one week before his term expired, was hurriedly brought for the crime of “incitement of insurrection” over the January 6th protests at the US Capitol, which sought to delay or defer the certification of an election that the majority of protesters were convinced, rightly or wrongly, was illegitimate. With so loose a definition of “insurrection” central to this hastily cobbled-together impeachment case, it is germane to note that this was an “insurrection” in which not a single firearm was wielded by the perpetrators. Notwithstanding some appalling behaviour by some of the protesters on January 6th, the hyperbolic charge that the President incited violence or insurrection was without substance, resting on the flimsiest pretext that President Trump had “threatened the integrity of the democratic system, interfered with the peaceful transition of power, and imperiled a coequal branch of Government” merely by questioning the validity of the election result in certain swing states and contesting their certification.

For those sceptical of the aforementioned analysis, the following assessment from Professor Jonathan Turley – Shapiro Chair for Public Interest Law at George Washington University- a nationally recognised legal scholar who has written extensively in areas ranging from constitutional law to legal theory to tort law, bears scrutiny:

“They’re going to argue we don’t have due process for Trump. Why make that argument real?” Those words from House Judiciary Committee Chair Jerrold Nadler, D-N.Y., stand out among the shocking disclosures in the recently released book “Unchecked: The Untold Story Behind Congress’s Botched Impeachments of Donald Trump”, in which Politico Playbook co-author Rachael Bade and Washington Post reporter Karoun Demirjian recount how House Intelligence Committee Chair Adam Schiff and Speaker Nancy Pelosi overrode objections from Nadler that the lack of witness testimony was a denial of due process for then President Donald Trump. He put it plainly and correctly: “It’s unfair, and it’s unprecedented, and it’s unconstitutional.”

It was a strikingly familiar objection.  I testified at the first Trump impeachment before Nadler and criticized the lack of any factual witnesses or Judiciary Committee hearings supporting the articles of impeachment. The book details a position of the House Judiciary that is strikingly similar to my own testimony.

The book, however, has not brought a sense of vindication as much as frustration. Nadler publicly toed the line with Pelosi to support a process that he reportedly viewed as abusive and “unconstitutional” even as some of us were set upon by a legion of irate liberal pundits. Worse yet, the book indicates that the bar on witnesses was not compelled by the schedule as claimed by Pelosi and Schiff, but instead dictated by raw politics.  It was, I wrote, a decision to follow the rule of Franz Kafka’s character that “My guiding principle is this: Guilt is never to be doubted.”

On the second impeachment, they went one better. They jettisoned any witnesses (including legal experts) in what I called a “snap impeachment.”

During the impeachments, I suggested that the reason was not any limitation of time but tactical advantage. In both rushed impeachments, Pelosi then held back the articles of impeachment before sending them to the Senate – destroying even the pretense of exigency as the reason for abandoning due process.

The book appears to confirm the Kafkaesque logic. It states that neither Pelosi nor Schiff wanted to risk a witness or member going off script by allowing true due process. When Nadler raised historical and constitutional objections, Schiff reportedly barked back that he needed to watch “his tone” and complained “you’re putting us in a box.”

That box is an effort to guarantee fairness and Nadler reportedly and correctly observed that “if we’re going to impeach, we need to show the country that we gave the president ample opportunity to defend himself.”

In my testimony in the only hearing held by the Judiciary Committee in the two impeachments, I objected that “this is wrong. It is not wrong because President Trump is right…No, it is wrong because this is not how an American president should be impeached.”

I relied primarily on the Nixon and Clinton cases to show how far the House was outside any historical navigational beacons. It turns out Nadler and his staff reached the same conclusion and cautioned Schiff and Pelosi to “Stick close to the Nixon and Clinton cases.” They refused.

Dan Goldman, Schiff’s lead counsel and the Democratic nominee to represent New York’s 10th District in the House, scoffed and mocked Nadler: “Jerry Nadler? With him, everything is negotiable.” When Nadler’s team argued for an approach (as I did) “more like Nixon,” Schiff’s team reportedly dismissed due process and said, “F— Donald Trump.”

People can disagree on the merits of the impeachments, but both impeachments were an abusive use of the Article I authority in the denial of any substantive hearings before the Judiciary Committee. While it was constitutional in the sense that there is no required process, it was wrong from both a historical and procedural perspective.  Of course, the public was not allowed to either hear from witnesses or know that seven Democrats like the Judiciary Chair objected on these same grounds.

Indeed, when the House elected to pursue the January 6th investigation, they followed the same playbook with Schiff as a member.  Traditionally, each party is allowed to pick its own members on such committees. However, Pelosi rejected two of the Republican members and the rest of the party (except outgoing Reps. Liz Cheney and Adam Kinzinger) boycotted the hearings. The result was the same one-sided production without a hint of fairness or balance in exploring possible defenses or counterarguments.

What is most sad about this account is that for a critical moment Nadler rose to the occasion. He defended not just the historical authority of his committee but the constitutional norm, even for a president despised by Democrats. That twilight moment of clarity was soon lost. The book recounts how Nadler made an “effort to get back into Pelosi’s good graces.”

When I testified, there was not a hint of concern or dissent. Nadler and the Democrats scoffed at the notion that the impeachment departed from core historical precedent or legal protections.

They had, as Nadler predicted, made the due process arguments “real,” but no one cared. To paraphrase Goldman’s reported observation, in Washington, “everything is negotiable.”

The Mueller Report- A Triumph of Rhetoric Over Substance:

Robert Mueller was generally portrayed in the legacy media as a man of impeccable credentials, entirely objective and above reproach.

Yet, now that Robert Mueller has bowed out in the aftermath of his report into putative Russian collusion in the lead-up to the 2016 Presidential election, I would make some personal observations on both his words and his behaviour. It should be borne in mind that, as a lawyer, it is Robert Mueller’s stock in trade to use language in such a way as to omit details that might be inconvenient; to cast aspersions without really saying so, manipulating words and phrases to convey an impression (even a false one) without being explicit; or to leave a door open for political purposes by framing his comments to that end.

So I ask, would a man “beyond reproach” state as follows: 

“If we had the confidence that the President clearly did not commit a crime, we would have said so.”

This reverses the onus of proof under the law, implying that President Trump had some kind of obligation not only to prove that he did not commit a crime, but to do so to the extent that Mr Mueller was “confident” of his innocence. That is a “when did you stop beating your wife” kind of statement: an attempt to smear when he had no proof worthy of the name, and an appalling example of suggesting guilt by implication without the proof required to make so grave an accusation toward a sitting President.

Would a man beyond reproach also so blithely state: 

“When a subject of an investigation obstructs that investigation or lies to investigators it strikes at the core of the government’s effort to find the truth and hold that individual accountable.”

Note that, as framed, this statement does not explicitly claim that President Trump obstructed or lied to investigators himself. Mueller merely states a general principle, one no doubt true in general terms, but he then juxtaposes it with implied references to President Trump, smearing him indirectly without any evidentiary support for the contention.

Mueller makes no attempt in his report to provide any convincing, detailed evidence to back this statement up, and he does not need to, because the sentence never directly mentions President Trump; it merely implies his guilt by association, or perhaps more accurately, by juxtaposition.

Any self-respecting presiding judge in a trial where such a statement was made during proceedings would have the comment struck from the record, as it is clearly a prejudicial assertion that assumes facts not in evidence, something that Mueller, as an experienced and well-credentialed legal mind, would know only too well.

Also: Would a man beyond reproach, during a thorough two-year investigation:

1. Allow Democrats with deep ties to the Obama administration and to Trump’s presidential opponent to predominate in his office, and allow some of them, such as Andrew Weissmann, Jeannie Rhee and Peter Strzok, to become prime movers in the investigation, without mentioning any potential conflict of interest or bias, whilst also crucially omitting from his final report the reasons that necessitated Mr Strzok’s removal from his team?

2. Omit to mention that Joseph Mifsud, the source of the original claim to George Papadopoulos about Russian dirt, which Papadopoulos then regurgitated to Alexander Downer, had extensive ties to Western intelligence agencies?

3. Downplay the importance of the Steele dossier by failing to mention it at all in the Russian collusion section of the report? Had he done so, Mueller would then have had, in my view, to incorporate evidence that the dossier was formulated through collusion with Kremlin sources (entirely germane, one would have thought, to the issue of Russian collusion) on behalf of the Democratic National Committee- a lie of omission to downplay the dossier’s origin, how and to whom it was disseminated, and what role it played in the collusion narrative and investigation. The dossier is mentioned elsewhere, but only in passing, in the obstruction of justice section.

4. Fail to mention Christopher Steele’s benefactor and Fusion GPS colleague, its founder Glenn Simpson, by name anywhere in the report, even though his involvement was relevant to proceedings? 

5. Fail even to mention that Natalia Veselnitskaya was granted special entry into the US by the DOJ on two occasions in 2015, in spite of dubious associations; that she worked with Fusion GPS on behalf of Russian clients; and that she met with Glenn Simpson of Fusion GPS on the morning of the now famous Trump Tower meeting, as well as the night before and the night after it?

6. Mention that the Trump Tower meeting might have been in violation of campaign financing law, but omit that, by the same metric, Hillary Clinton’s campaign, through the Steele dossier, was almost certainly even more obviously vulnerable to such potential violations?

7. Try to find someone guilty of “obstruction”, in the absence of an actual crime to obstruct?

8. Omit from his report crucial details about many of the Russians mentioned (Oleg Deripaska for one) and their sometimes extensive ties to the FBI and other US intelligence agencies? 

9. Gloss over the possible conflict of interest and potentially improper actions of Rod Rosenstein, who was not only an active participant but also a witness, and more than likely a potential target of the investigation himself? The questionable behaviour of former FBI Director James Comey and former Deputy Director Andrew McCabe was likewise conspicuous by its absence.

10. Affirm that he wouldn’t be testifying before Congress, saying that the report should be considered his testimony, and that he would say nothing beyond its contents to Congress or anyone else? “I will not provide information beyond what is already public before Congress…the report is my testimony…I do not believe it would be appropriate for me to speak further”. The lack of candour should be obvious to anyone with eyes willing to see.

Robert Mueller also said that his lack of a decision regarding any potential obstruction of justice was “informed” by the fact that the President couldn’t be charged. This is an example of “weasel words”, deployed to avoid saying something that contradicted what he had told Attorney General Barr on at least three previous occasions.

You can be “informed” of something without its having any material effect on whether you recommend charges or not, and without its saying anything about the merits or otherwise of those charges. As a lawyer, Mueller chooses his words VERY carefully, to convey an impression (often prejudicially) without flat out saying it. One would prefer a plain speaker who says what he thinks and means what he says. Robert Mueller does neither.

There is a good reason for the constitutional protection of the POTUS: to avoid needless interference with the incumbent’s performance of the duties of office over trivial misdemeanours. Robert Mueller effectively circumvented this protection with his highly prejudicial and carefully worded statements after his investigation came up empty-handed, ensuring that President Trump remained hamstrung until the next election, as was transparently his main aim and purpose.

Mueller has, in my opinion, gone out of his way, in partisan fashion, to portray President Trump in the worst possible light, using semantics and rhetoric, even when the evidence was lacking for the alleged “crime” upon which the investigation was predicated. He has also gone out of his way to avoid looking into, or incorporating into his investigation, the Russian collusion and election interference that definitely did occur on the part of the Democratic National Committee and the Clinton campaign, copious evidence of which he could not have failed to come across, yet which remained hidden in plain sight and managed not to make the final cut of his report, conveniently protecting his fellow “swamp” dwellers.

For the record, I certainly view Trump as a narcissistic personality, and he is as impulsive and uncouth as many clearly perceive him to be. I also believe that both Donald Trump and Hillary Rodham Clinton were totally unfit for the office of US President. In fact, Hillary Clinton’s handling of sensitive government emails through her private server whilst serving as US Secretary of State should have immediately disqualified her as a candidate, and deleting those emails in defiance of a subpoena should quite possibly have seen her jailed. Nothing President Trump has done before or since comes close to that level of illegality.

I also don’t doubt that President Trump honestly believes that he was the victim of a witch hunt, and his actions since are entirely congruent with that belief. Not submitting himself to cross-examination by Mueller’s partisan investigators, to avoid a perjury trap, is likewise not obstruction, just sensible self-preservation, as is his right.

Mueller refers to “a subject” generically, but implies Donald Trump without naming him. This has the effect of suggesting that Trump did obstruct the investigation and lie to investigators, when no such thing has been established. It is a very clever but entirely disreputable tactic, one that avoids the tawdry requirement of actually being true, because he has not openly said it, merely implied it. As a lawyer who has seen more than his fair share of trials and courtrooms, Robert Mueller would be well aware of what a prejudicial comment is, and would jump on any opposing counsel who attempted the same against him.

Robert Mueller hasn’t made a remotely convincing case for obstruction of any kind having occurred. He was, however, looking for any reason to keep the door open, and through clever wording he achieved just that.

Independent journalist and former Rolling Stone writer Matt Taibbi wrote in a recent opinion piece about “Russiagate” and the Mueller Report:

Russiagate isn’t just about bad reporting. It was and is a dangerous political story about rallying the public behind authoritarian manoeuvres in an effort to achieve a political outcome. 

Republicans who battered Mueller with questions weren’t wrong. Investigators in the Russia probe made extravagant use of informants abroad (in the less-regulated counterintelligence context), lied to the FISA court, leaked classified information for political purposes, opened the cookie jar of captured electronic communications on dubious pretexts, and generally blurred the lines between counterintelligence, criminal law enforcement, and private political research in ways that should and will frighten defence lawyers everywhere. 

Proponents cheered the seizure of records from Trump’s lawyer (Michael) Cohen, sending a message that attorney-client privilege is a voluntary worry if the defendant is obnoxious enough. The public likewise shrugged when prosecutors trashed Maria Butina as a prostitute, because Butina a) is Russian, and b) palled around with the NRA. 

This case has seen would-be liberals embracing guilt by association, guilt by nationality, guilt by accusation, entrapment, secret evidence, and other concepts that were considered anathema to progressives as recently as the War on Terror period. 

In the name of preventing the “sowing of discord,” they’ve even embraced censorship. Finally, in an effort to milk the Mueller report for maximum effect, Democrats – ostensibly the party of card-carrying ACLU members – are trying to uphold a vicious new legal concept, “not exonerated.” 

In a moment that provided a window into the authoritarian tendencies Mueller once expressed with more fluency, the Special Counsel declined under questioning by Ohio Republican Michael Turner to reject the idea that in our legal system, “there is not power or authority to exonerate.” This was equivalent to no-commenting a question about whether people are innocent until proven guilty.

In America, prosecutors don’t declare you exonerated; you are exonerated until someone proves otherwise. Efforts to reverse this understanding are dangerous, Trump or no Trump. It’s appalling that Democrats are backing this idea. 

All these excesses have been excused on the grounds that Trump must be stopped at all costs. But you don’t challenge someone for being racist and an enemy of immigrants, the poor, and the environment by turning the federal security apparatus into a Franz Kafka theme park. It’s fighting bad with worse.

As an exercise, then, let us instead rewrite Robert Mueller’s conclusion to his report to Congress for him, with an eye to clarity and fairness, rather than suggesting guilt where none exists. 

After an extensive and exhaustive investigation over a two-year period, we found no significant or conclusive evidence that either President Trump or his campaign was involved in any active collusion with Russia or its operatives to influence the 2016 Presidential election. 

The evidence we did collect regarding the President’s intents and actions fell well short of that required to pursue any form of prosecution, and as such the President has no further case to answer in this regard. 

Given the requirements of natural justice, and the longstanding legal tenet of one being innocent until proven guilty, we also found there was insufficient evidence of any obstruction by the President of our enquiry, especially given that the President supplied all the written documentary evidence required of him. 

Given the importance of the President’s role in public office, we could not, in all conscience, continue our investigation any longer when the negative impact on the performance of his duties exceeded the potential benefit to be gained from any further pursuit of our inquiry.

Whilst it would be clearly impossible to fully prove the President’s innocence in such circumstances, given the nature of the accusations made, the absence of any conclusive evidence to implicate the President in any illegal activity effectively exonerates him of having committed any crime.

Finally, the idea that every half-baked accusation, no matter how frivolous, needs to be tested in a court of law, often at horrendous expense, is ludicrous. 

President Trump had no obligation whatsoever to give Robert Mueller testimony over allegations he knew to be untrue. No one’s recall is perfect, and a misstep in giving evidence, such as not recalling a date accurately or misquoting someone ever so slightly, would open the gate to accusations of “misleading” investigators, even where no such intent existed. 

President Trump supplied the documentary evidence requested of him, and did not fire or replace Mueller, allowing his probe to carry on to its conclusion, whereupon no evidence of collusion was found. The alleged instances of obstruction would not, in my opinion, stand up in any court of law in the country, since “thoughtcrime” is not on the statute books, nor is it illegal to vent anger or frustration at what one perceives to be unjust treatment or false allegations directed at oneself.

Robert Mueller’s manipulative use of language, implying something unlawful without saying it, and then stonewalling Congress rather than answering hard questions about certain critical (and illuminating) omissions in his report, tells a different story to the narrative that he is above reproach and completely impartial.

Whenever questioning in the congressional hearings turned to the Steele Dossier, to the role of Glenn Simpson and Fusion GPS, the Democratic National Committee’s funding of the dossier, the use of Russian disinformation in the composition of the Steele Dossier (via its principal source Igor Danchenko), the role of various members of the FBI/CIA/DOJ in proceedings, and so on, it was met with the stonewalling statement from Robert Mueller that it was “not in my purview”.

Given that much of the detail in question was directly relevant to the substance of his report, one has to ask why these facts were not in Robert Mueller’s “purview”, and whether the scope of his investigation was deliberately limited in order to protect the Democrats, and certain members of the FBI, the CIA and the DOJ, from scrutiny of their dubious behaviour in the lead-up to, and immediate aftermath of, President Trump’s election.

The Democrat Aligned and Their Enablers- Actions Beyond the Pale:

The behaviour of the Democrats, and of their sympathetic affiliates within the FBI and DOJ, leading up to and throughout the Trump Presidency, particularly in the “Russiagate” (Spygate) and “Ukrainegate” (Impeachment 1) affairs, has been every bit as “deplorable” and inappropriate as President Trump’s incessant Tweeting, not to mention his bombastic and sometimes casually derogatory and/or abusive language.

Then FBI Director James Comey, for example, definitely abused his privileged position in order to leak confidential information (internal memos) to the media against his own Commander-in-Chief, President Trump, in a completely self-serving and totally inappropriate way, actions that were in violation of the expected conduct of his office. These criticisms were further confirmed by DOJ Inspector General Michael Horowitz’s report into the matter in 2019.

Similarly, House Intelligence Committee Chairman Adam Schiff’s behaviour, including allegedly lying to Congress about his prior contact with an alleged whistleblower, and then trying to coach witnesses in the first Trump impeachment hearings in order to protect Joe Biden and his son Hunter Biden from scrutiny, was also unequivocally appalling.

Deputy Attorney General Rod Rosenstein did not recuse himself from the Mueller investigation when he should have, given that he was a potential witness and/or a possible subject of scrutiny. Furthermore, in 2017 he suggested to FBI and Justice Department officials that he could secretly record President Trump in the White House, an act that was highly questionable at best and perfidious at worst, but certainly utterly disloyal and lacking in scruples.

Andrew McCabe, the Deputy Director of the FBI from February 2016 to March 2018, and Acting Director from May 9, 2017, to August 2, 2017 following James Comey’s dismissal by President Trump, was shown in the DOJ Inspector General’s report to have improperly authorised releases of information to The Wall Street Journal about an investigation into the Clinton Foundation, and then to have misled agents who questioned him about it on four occasions, three of them under oath. On February 14, 2020, the Justice Department informed Andrew McCabe’s attorneys that it had declined to prosecute him for these lies under oath, an outcome that stands in stark contrast to the treatment of General Michael Flynn that I will outline below.

Going back to the time prior to the 2016 Presidential election, former US Secretary of State Hillary Clinton had her email server erased (as I mentioned previously) using the BleachBit cleaner program to remove any trace of its contents, in direct contravention of a subpoena ordering her to produce those same contents into evidence. Even having a private email server for personal use, whilst she was officiating as US Secretary of State and routinely handling classified information, was at the very least careless and utterly irresponsible. Had President Trump been guilty of such an infraction, he would have been impeached in a heartbeat.

The clear pattern of inappropriate activity on the part of the Democrat-aligned parties and their sympathisers in these proceedings, and the lengths to which the FBI and the DOJ went to cover for them, sweep matters under the carpet, or otherwise look the other way in pretending such matters were merely random careless errors of judgment, or of minor importance, is completely beyond the pale.

The Strange Case of Lt. General Michael Flynn:

Lt. General Michael Flynn was U.S. National Security Advisor for the first 22 days of the Trump administration. He resigned in light of reports that he had allegedly “lied“ regarding the content of conversations with Russian Ambassador Sergey Kislyak, held when he was a member of the Trump administration transition team. Flynn was alleged to have mentioned to Kislyak that he hoped any Russian response to the US sanctions imposed by the Obama administration might be kept “proportionate”.

Flynn’s military career included a key role in shaping U.S. counterterrorism strategy and dismantling insurgent networks in the Afghanistan and Iraq Wars, and he was given numerous combat arms, conventional, and special operations senior intelligence assignments. He then became the 18th Director of the Defense Intelligence Agency in July 2012 under the Obama Administration, serving until his retirement from the military in August 2014.

Of note, prior to coming into the role of US National Security Adviser in the Trump administration, General Flynn had been highly critical of Barack Obama’s foreign policy in the Middle East and the Arab Spring during his time as DIA Director, and he had publicly flagged a root and branch audit and reform of both the NSA and the CIA. 

Some interesting facts regarding the protracted legal case against General Flynn by the DOJ:

The FBI interview that General Flynn gave, at the centre of these allegations of making “false statements”, wasn’t even memorialised verbatim at the time of the interview, but was edited later by Agent Peter Strzok, who altered it dramatically from Agent Pientka’s original notes. One should recall that this is the same Agent Peter Strzok of “texting” fame, who exchanged messages with his lover, FBI lawyer Lisa Page, about his desire to undermine the President, amongst his other partisan activities.

Tellingly, in one text, dated February 10, Strzok tells Page that he is editing Pientka’s 302 form (the FBI witness interview report) so heavily that he is “trying not to completely re-write” it. 

From Real Clear Investigations: 

Other messages reveal that Page, who did not attend the interview with General Flynn, reviewed the 302 form and also made editing suggestions. On February 14, Page texts Strzok, “Is Andy good with the 302?” – presumably referring to FBI deputy director Andrew McCabe. 

The next day, February 15, the Flynn 302 was officially submitted and filed with the FBI.

The record of General Flynn’s responses was therefore not verbatim. Agent Pientka took the original notes, and Agent Peter Strzok then heavily edited and altered the memorialisation of Flynn’s alleged responses to their questions three weeks after the interview, despite not being the original note taker.

The question of General Flynn allegedly “lying” to the FBI seems to revolve around the small distinction between “mentioning” something in the call to Sergey Kislyak about Russian sanctions, as opposed to formally “discussing” sanctions or requesting anything specific from his Russian counterpart, something in the former instance that he, as a member of the Trump transition team, was perfectly entitled to do.

Interestingly, the contemporaneous notes from Flynn’s interview state that neither FBI agent thought Flynn was lying, but merely that he could not recall the full details of having discussed the issue with Kislyak.

General Flynn was ultimately pressured by the FBI to “cop a plea” and plead guilty to “lying” to the FBI, in order to avoid the personal bankruptcy that a protracted legal case would entail (a tactic known as “lawfare“), and also to spare his own son (as FBI agents explicitly threatened at the outset) from being caught up in the same protracted legal wrangle that might ruin him too, when there was no actual crime, or even a predicate for the investigation, in the first place. Avoiding bankruptcy and protecting one’s son from expensive litigation was clearly a powerful motivator for Gen. Flynn to plead guilty, when he was guilty of nothing more than being an active member of a Presidential transition team making the preliminary diplomatic contacts that would be expected and necessary for any incoming administration.

An even bigger question is why the consequences were so totally different for FBI Deputy Director Andrew McCabe, who similarly admitted to lying to the FBI, yet apparently warranted no charges at all. McCabe’s lies were made under oath, whereas the informal interview undertaken by Strzok and his associate (memorialised three weeks after the fact, as mentioned above), conducted without advising General Flynn that legal representation would be needed and should be present, could hardly be characterised in the same fashion.

The actions of Judge Emmet Sullivan, presiding in General Flynn’s court case, were also completely extraordinary, and in my view smacked of political motivation with the upcoming 2020 Presidential election in mind. After the Justice Department applied to drop the case against Flynn, Judge Sullivan instead placed the matter on hold to solicit amicus curiae briefs from third parties. General Flynn’s new legal counsel (Sidney Powell) then asked the DC Circuit Court of Appeals to compel Judge Sullivan to drop the case; her request was initially granted by a three-judge panel, but then (of course) denied on an en banc rehearing. On November 25, 2020, Flynn was issued a presidential pardon by President Donald Trump, and on December 8, 2020, Judge Sullivan dismissed the criminal case against General Flynn, stating that he would probably have denied the Justice Department’s motion to drop the case.

In a major victory for Michael Flynn, the United States Court of Appeals for the District of Columbia Circuit has ordered Judge Emmet Sullivan to grant the Justice Department’s request to dismiss the case against the former Trump National Security Adviser.

Upon consideration of the emergency petition for a writ of mandamus, the responses thereto, and the reply, the briefs of amici curiae in support of the parties, and the argument by counsel, it is ordered that Flynn’s petition for a writ of mandamus be granted in part; the District Court is directed to grant the government’s Rule 48(a) motion to dismiss; and the District Court’s order appointing an amicus is hereby vacated as moot, in accordance with the opinion of the court filed herein this date.

Decisions to dismiss pending criminal charges – no less than decisions to initiate charges and to identify which charges to bring – lie squarely within the ken of prosecutorial discretion.

The Judiciary’s role under Rule 48 is thus confined to “extremely limited circumstances in extraordinary cases.”

But the Inquisition must go on…

“Hours after the US Court of Appeals for DC ordered Judge Emmet Sullivan to grant the DOJ’s request to drop the case, the retired ‘resistance’ judge (Gleeson) hired to defend Judge Sullivan’s actions has filed a motion requesting an extension to file his findings against Flynn.” Clearly, Flynn’s case had to be kept alive until after the 2020 Presidential election at all costs. 

In my view, this case, as it played out in all its interminable glory, made General Flynn at that time the nearest thing to a political prisoner, and a pawn in a Deep State game of self-protection.

“Six Ways From Sunday” (hence the “Six No Trumps” of the title of this article) was the phrase Democrat Senate Leader Chuck Schumer applied to the means by which the intelligence community could victimise President Trump if he crossed them, but it was General Flynn who, more than most, would come to see precisely what that comment meant. No crime committed, and therefore no predicate; no case to answer in any fair proceedings. Yet Flynn was kept in prolonged legal limbo to financially destroy him and irreparably ruin his reputation.

This General Flynn case demonstrates just how far we have gone down the rabbit hole into Wonderland, where everyone speaks in riddles and the laws of logic, or more precisely of morality and legality, no longer apply. The entire American political system, along with the DOJ, the FBI and the CIA, is in dire need of a root and branch cull and overhaul. It has become, in my view, corrupt to its core, and President Trump was more a symptom of the rot at that core than the problem itself. If anything, Trump’s elevation to the highest office in the land has shone a light on what has been lurking in the dark corners of the corridors of power in the US over the last few decades.

The Obstruction of Justice Ambit To Keep Everything In-House:

From the very early days of the Trump Administration, when the Russiagate/Crossfire Hurricane melodrama of false allegations was at its peak, the hapless President was adamant that he wished to declassify all of the evidence of FBI, CIA and DOJ wrongdoing, so the public could see just how lawless and unaccountable the “swamp” had become. This is where the series of overlapping investigations, from Inspector General Michael Horowitz’s Investigation and Report, to the Mueller Investigation and Report, to the Durham Investigation, and several more minor investigations in between, gathers a new and enlightening context.

The motive for these investigations becomes clear when you understand that Attorney General William Barr, and his predecessor Attorney General Jeff Sessions before him, had advised President Trump that these ongoing investigations meant that such public disclosures, being material to ongoing investigations, would constitute obstruction of justice if the President were to release them into the public domain, as he had expressed his desire to do.

Once one understands the limitations this series of overlapping investigations placed on President Trump’s plans for total public disclosure, everything falls into place. It becomes clear that AG Barr only pretended to redress the Russiagate disinformation campaign, and never had any intention of properly following through. The clue to this is in John Durham’s probe beginning in April 2019, but then not interviewing one of the main players, former CIA Director John Brennan, until 24th August 2020, a full 16 months into the investigation, and by then with an election a mere two months away. At the time of writing, this probe is still ongoing nearly two years into the Biden Presidency, with no sign of resolution, and with only one failed (and entirely half-hearted) court case to its credit.

Attorney General William Barr was there to shepherd Trump through to election defeat, making all the “right” noises to the President, but really he was just there for damage control, to ensure a minimum of swamp drainage and of reputational damage to any of the players in the Washington money-go-round. Having slow-walked the John Durham probe into oblivion, AG William Barr had shown his partisan true colours. The “Deep State” clearly protect their own. 

Whilst often consigned (conveniently) to the realms of conspiracy theory, there clearly is a “Deep State” that acts as a form of gatekeeper shadow government in the US, and there has been for decades. President Trump was a mere babe in the woods when he waded into this swamp, and he was arrogant (and naive) enough to think he could “drain the swamp“ and disrupt the cushy little network of closely aligned friends, colleagues and fellow travellers that these Deep State players had set up for themselves. 

They have outed themselves as interested only in the pursuit of power (and in back-handed deals to enrich themselves in the process); the welfare of the USA and of Americans in general is clearly irrelevant to them. President Trump wasn’t smart or savvy enough (but then, who is?) to overcome an enemy that comprised a large majority of the prominent figures within his own administration. Donald Trump repeatedly put his faith in people who, from day one, were working to undermine and ultimately neutralise him and his stated agenda.

The Mueller Investigation and Report were premium delaying tactics in a similar vein, with one minor operative, FBI lawyer Kevin Clinesmith, given only a slap on the wrist (one year of probation and 400 hours of community service) for doctoring an email to mislead the FISA Court, whilst several others were given a free pass, like Deputy Director of the FBI Andrew McCabe, who (as stated earlier) admitted to lying to FBI investigators about a Wall Street Journal leak. 

President Trump, in his “bull in a china shop” fashion, disrupted the Washington three-ring kickback circus, and they had “Six Ways From Sunday” to neutralise, bamboozle and ultimately get rid of him. And they will continue to make him pay into the future, such are the tangled webs of plots and schemes they have at their disposal should he ever threaten them again. Whilst I have to admire President Trump for his tenacity and perseverance, I nevertheless fear for his safety, up to and including assassination, if push came to shove. I would put it to you that there is nothing to which they would not stoop in the name of self-preservation against the threat that Trump represents.

Trump’s personal and moral failings are legion. But in getting rid of this rogue outsider who had made himself a threat to the corrupt Washington gravy train that politicians and lobbyists had made for themselves, they have beaten the integrity of every pillar of American democracy into submission, to the extent that half the US civilian population is effectively at war with the other half, and faith in the mainstream media and in judicial integrity is at an all-time low.

The Obama Years: Lost Opportunities, the Weaponisation of the Putative 4th Branch of Government, and the Abrogation of the Peaceful Transition of Power:

President Barack Obama’s economic credentials over the 8 years of his administration have been undervalued and under-appreciated by those politically opposed to him. He inherited the immediate post-GFC economy at the outset of his administration, so the early days of his first term were bound to be difficult economically, and whatever successes he had were clearly overshadowed by mass bailouts, TARP (the Troubled Asset Relief Program, a program to purchase toxic assets and equity from financial institutions), ZIRP (Zero Interest Rate Policy), and “Quantitative Easing” (i.e. money printing) to the moon and back.

By the same token, some of the post Global Financial Crisis recovery was bound to happen regardless of any policies on President Obama’s part. The recovery was slow but steady, and I would give Barack Obama a pass mark or better on that front.

The main disappointment of Barack Obama’s presidency, however, was that he promised so much “hope and change” but delivered so little in return, particularly on bridging the ever-widening racial divide as the first African American President. That President Obama did so little to unify the country, and so often stoked those divisions in his commentary on certain events that occurred during his two terms (Trayvon Martin’s death, and the 2014 police shooting in Ferguson, Missouri), was particularly disappointing.

President Obama was a particular failure, however, on the foreign policy front, where, having initially done the right thing by not pulling out of the engagement in Iraq precipitously, he eventually did just that, thereby paving the way for the expansion of the Islamic State terrorist group (ISIS). The “red lines” that were crossed in Syria, the Libyan debacle, and the entire Arab Spring were at least in part such a disaster due to his weak leadership, as well as to the “highly qualified” mistress of disaster, Mrs Hillary Rodham Clinton, as his Secretary of State, who was soon followed by the hopelessly inept John Kerry.

But, at least this was the moment when “the seas subsided and the world began to heal“.

President Trump inherited a fairly moribund economy in 2017, one that had grown only in rebound fashion from the depths of the Global Financial Crisis (GFC) of 2008 under President Barack Obama. What President Donald Trump achieved economically prior to the COVID pandemic debacle was truly remarkable, however, particularly in reducing red tape for business expansion, delivering energy independence for the US for the first time in decades, and especially in providing record employment opportunities (the unemployment rate reached 3.5 percent, the lowest in half a century). This was particularly evident in disadvantaged communities, as unemployment rates for African Americans, Hispanic Americans, Asian Americans, Native Americans, veterans, individuals with disabilities, and those without a high school diploma all reached record lows under his tenure.

However, as a handbrake on the economy, nothing compares to the COVID-19 pandemic. The Global Financial Crisis required no lockdowns, no block on or major reduction in international travel (the COVID pandemic produced an 80-90% reduction in international travel), no contraction in international trade to such an extent, and no 60% reduction in footfall in the retail space. There is no parallel between the GFC recovery that Obama had to navigate and the economic ruin wrought by the complete lockdowns of entire economies in response to the COVID-19 pandemic with which President Trump had to deal.

The Patriot Act, the DOJ-NSD and the Targeting of Political Adversaries:

It was the Republicans, under President George W Bush, who created the Patriot Act, the Dept of Homeland Security (DHS), and the Office of the Director of National Intelligence (ODNI), putatively in response to the 9/11 terrorist attacks. This act, whether intentionally or not, laid the foundation for domestic surveillance in the US by increasing the tools of power and control at the disposal of the intelligence agencies, shifting the focus from external threats to national security to those inside the nation, where “We The People” became a threat to be monitored.

These same agencies were later weaponised by President Barack Obama through the Dept of Justice National Security Division (DOJ-NSD). Barack Obama and Eric Holder did not create a weaponised Dept of Justice and FBI; those institutions were already weaponised by the Patriot Act. What President Obama and his Attorney General Eric Holder did was take the preexisting system and retool it, so that the surveillance weapons of government targeted only one side of the political continuum. “Domestic terrorists” were now defined through the prism of political opposition.

In 2010, President Obama launched a joint, covert DOJ/FBI and Internal Revenue Service operation to target the TEA Party for IRS audits, after the midterm “shellacking” caused at least in part by the backlash against Obamacare. Mitch McConnell, Republican Minority Leader in the Senate, supported this targeting of the TEA Party because his Senate colleagues were being “primaried out“ by an effective grassroots campaign of TEA Party candidates.

The peasants were clearly revolting… so a visibly angry Mitch McConnell desperately made a deal with the devil to protect himself and the establishment Republicans. There are striking similarities between the TEA Party movement of 2010 and the MAGA movement under Donald Trump from 2015. When President Trump came into office in 2017, he met the same congressional opposition as the successful TEA Party candidates had in 2010.

President Barack Obama’s weaponising of the IRS in this way to target his political rivals, specifically the TEA Party on this occasion, was clearly a low point in US politics, and an unfortunate precedent that would resonate into the Trump era and beyond.

From the New York Times:

In October 2017, the Trump Administration agreed to settle a lawsuit filed on behalf of more than four hundred conservative non-profit groups who claimed that they had been discriminated against by the Internal Revenue Service for an undisclosed amount described by plaintiffs’ counsel as “very substantial.” 

The Trump Administration also agreed to settle a second lawsuit brought by forty-one conservative organizations with an apology and an admission that subjecting them to “heightened scrutiny and inordinate delays” was wrongful.

And of further note, in a similar vein but this time to address the threat of MAGA Republicans:

From the New York Times

(President Barack) Obama wanted one question answered clearly, in writing, before he left office: What did the Russians do during the 2016 election?

It was a difficult task, on a very tight timeline. Organizing the effort fell to Gen. James Clapper, his salty 75-year-old Director of National Intelligence, who often handled the president’s daily intelligence briefing personally.

To meet Obama’s request, Clapper pulled together intelligence gathered by the C.I.A., the F.B.I. and the N.S.A. A team of about 30 analysts worked through the holidays to meet Obama’s deadline.

There were some differences of opinion about what to include and how much confidence to assign to one main conclusion. But when Clapper walked into the Oval Office at 1 in the afternoon on Jan. 5 — along with the C.I.A. Director, John Brennan; the N.S.A. Director, Adm. Michael Rogers; and the F.B.I. Director, James Comey — the group had carefully rehearsed their presentation so they could speak with a single voice.

Gathered around the room were (President) Obama, Vice President Joe Biden and their senior advisers, as well as officials from the Department of Justice and the National Security Council. The four intelligence chiefs sat across from one another on two couches. One by one, they turned to address Obama.

So much for the peaceful transition of power!

As the Mueller report noted, there was no actual evidence of any “Russian Collusion”. None. It was a pretext to interfere with the peaceful transition of power to a POTUS they feared. The overriding trigger to act was Trump potentially signalling a thawing of relations with Russia, something intolerable to the Neocon hierarchy in Washington. But an even more pressing reason was that President Trump, with General Mike Flynn as his point man, was a real and present threat to the cushy little financial arrangements and institutionalised graft that make the Washington “swamp” so contemptible.

And, just for good measure, to rig the game for even greater political partisanship:

Via Reuters 

“When President Barack Obama entered the White House in 2009, the federal appeals court based in Virginia was known as one of the most conservative benches in the country. 

Two Obama terms later, Democratic appointees hold a 10-5 majority on the 4th U.S. Circuit Court of Appeals, a panel of which issued a groundbreaking ruling this April backing transgender rights. 

The shift to the left on the court, which hears cases from Virginia, Maryland, West Virginia, South Carolina and North Carolina, highlights a widely overlooked aspect of Obama’s legacy. 

His appointments of dozens of judges to the country’s influential federal appeals courts have tilted the judiciary in a liberal direction that will influence rulings for years to come and be further entrenched if Democrat Hillary Clinton wins this November’s presidential election.”

Hypocrisy is clearly a non-partisan issue. The political Left literally brag about President Obama stacking the Appeals Courts with liberals, yet fast forward to 2018 and 2020, and President Trump allegedly “stacking” the US Supreme Court (SCOTUS) with two conservatives (Brett Kavanaugh and Amy Coney Barrett) becomes the epitome of evil?

The Failure of Hillary Clinton- That “Glass Ceiling” Remains Stubbornly Intact:

Hillary Rodham Clinton was an extremely poor and compromised Presidential candidate for the 2016 election. Her use of a personal email server to conduct personal business whilst simultaneously handling state secrets, whilst she was US Secretary of State, should have disqualified her immediately from public office. The US Secretary of State is one of the highest offices in the US government, is ultimately responsible for foreign policy, and is privy to most US state secrets, the disclosure of which may well present a real and present danger if mishandled, and one which threatens damage to both national security and the national interest.

Trump’s narcissistic personality and the distinct lack of temperance in his comments and demeanour should have prevented him from being elected as President, even though some aspects of his stated agenda, particularly in wanting to revitalise US manufacturing and industry, and to remove the regulatory noose strangling the economy, were attractive to many.

Such were Hillary Clinton’s personality flaws, however, and the negative connotations of her various past associations, statements and actions, each of which hindered her effectiveness as Trump’s election adversary, that the unthinkable was allowed to happen, and the glass ceiling that was “predestined” to be shattered by her ascension to the throne remained pristine and intact after a humiliating and altogether unexpected loss. “What happened?” indeed (as her book about the causes of the election loss was ironically titled, minus the qualifying question mark).

Whilst Donald Trump was articulating the case for the working poor in “fly over states”, Hillary Clinton and the Democrats conspicuously sought to govern solely for the East and West Coasts, the Hollywood elite and the Globalist vulture class, whilst the remainder of the country, the working class and their aspirations for a better life and shared prosperity in particular, were ignored entirely, being far too “deplorable” to be worthy of even the slightest consideration.

When “the Big Lie” Might Possibly Be True:

Most. Secure. Election. Ever.

Cynics might suggest instead that the 2020 US Presidential election was so “secure”, in fact, that the result was secured even before the first vote was cast.

I personally make no definitive assertion whatsoever that the election result was materially altered by the irregularities that did undoubtedly occur. However, there were unquestionably some highly irregular instances shown to have occurred during the 2020 election that are difficult to dismiss out of hand, “computer glitches” (e.g. Antrim County) that only favoured one candidate for example, and there are hundreds of sworn affidavits by witnesses, and copious video and documentary evidence, all of which require wholehearted investigation and thorough explanation, at the very least.

This was not done to most people’s satisfaction, and gives the general impression of mendacity, caginess and obfuscation. 

Anyone who is concerned about the potential for the overthrow of our democratic systems would insist on a forensic audit of voting machines and signature verification of votes, and would also deplore (and seek redress from) Big Tech companies for censoring only one side of the political divide in the immediate pre-election period.

It is indeed somewhat amusing how the Left leaning people in the US are so sure that election fraud didn’t occur, yet seem so afraid of actually proving it. 

Showing President Trump up as a liar would only require the legislators and electoral officials to lay their cards completely on the table. A forensic and objective audit addressing all the allegations that have been made would consign the former President to eternal disgrace, and convince those taken in by those false allegations that their fears were without foundation. What better way of debunking those claims than by disproving them beyond any possible doubt?

Just as an innocent man accused of a crime can’t wait for the opportunity to submit to a lie detector test to prove his innocence, so should those who stand accused be champing at the bit to submit to independent and transparent vote by vote forensic examination to prove their accusers wrong.

I am someone who values the integrity of the pillars of democracy, but refuses to allow it to be sacrificed merely to get rid of an allegedly “unfit” POTUS. Once the precedent of a compromised election is established, free and fair democratic elections will be nothing but a distant memory. 

Governments of all political stripes are completely unworthy of such unfettered power that an insecure electoral process provides, and they must always be answerable to the people at the ballot box. 

I believe that the US electoral system’s fragility is now there for all to see:

From Time Magazine:

That’s why the participants want the secret history of the 2020 election told, even though it sounds like a paranoid fever dream–a well-funded cabal of powerful people, ranging across industries and ideologies, working together behind the scenes to influence perceptions, change rules and laws, steer media coverage and control the flow of information. 

They were not rigging the election; they were fortifying it. And they believe the public needs to understand the system’s fragility in order to ensure that democracy in America endures.

Or to summarise: Democracy (TM): Fortified for your protection. 

It’s wonderful (at least for some) that this cabal of powerful people sought to sanitise the United States of America in this way, much as a luxury hotel chain places a toilet seat band to signify to guests that every endeavour has been made to sanitise a toilet for the good of its customers.

The allegations of electoral fraud, once made by a sitting President, cannot be “unmade”, regardless of whether or not the outcome was materially influenced to the extent of overturning the result.

They needed to be dealt with openly, transparently, and forensically down to the last dotted “i” and crossed “t”. 

Then, if, as many seem to allege, those allegations are without foundation, that would be proven to the satisfaction of most of the 70+ million voters who still believe that they were defrauded of the 2020 election.

Failing to do so has left a fatally divided nation, one where “healing” of divisions cannot conceivably happen, in the term of this current President at the very least, indeed if ever. One could be forgiven for thinking that some people in power in the US desire exactly that outcome.

This is therefore a very dangerous time in US history. Had the allegations of election fraud been handled differently, and proven false rather than waved away on procedure by the Courts, what potentially comes next could have been avoided.

And if, in the unlikely event that widespread and systemic fraud had been proven, then the integrity of future elections would have been strengthened with safeguards against it ever happening again. 

The fault for what potentially follows lies with all those who attempted to sweep allegations of election fraud like this under the carpet. If unfounded allegations are made, then the only way to restore confidence in the process is to be free and open to forensic and objective investigation.

Once that is done, in the full glare of the media spotlight, the truth should come out, and if false then the perpetrators of those allegations are seen for what they are. 

Even if this occurs, should the media “run dead” on the story, having presupposed the conclusion that suits their political inclinations, lingering doubts will still remain, because people can see when the media tries to bury a story they don’t like.

The mainstream, legacy media are indeed “One-eyed Jacks“, and are a major “wild card” component of the overall problem. They believe they are being smart and subtle in their duplicity and deceit, when in fact the public are not nearly as blind, nor as stupid as the media cognoscenti would like to pretend they are.

One Eyed Jacks Have Little Heart and Never Call a Spade a Spade (Source: Wikipedia)

Conclusion:

Many people focussed on and worried about another 4 years of a “bad” President in the shape of Donald Trump. In so doing, however, they have instead facilitated a raft of very bad precedents that won’t just last 4 years, but will likely endure indefinitely.

The era of free and democratic elections has ended, in my view, regardless of whether or not the 2020 election result was influenced to the extent of overturning it, because the integrity of all future elections has been fatally and permanently compromised.

In attempting to destroy Donald Trump, the Deep State players destroyed the democratic processes that existed up till now instead. All elections in the US from now on may well be merely for show- preordained and manipulated for a desired outcome. Some issues are far more important than one single venal and arrogant President, no matter how deficient in some areas he may well be. 

Four more years of President Trump, whatever his deficiencies, is worth enduring to preserve something fundamentally more important than him. Sadly, many are seemingly willing to sacrifice the whole to eliminate a tiny element of that whole. 

Whilst Trump clearly makes a convenient scapegoat in some quarters for all that is wrong in society at present, the fact is that he is the symptom not the cause. His brand of narcissistic populism is the natural reaction to a corrupt political system populated by chancers and bounders.

Currently, we have a political class and elite oligarchy (in the US in particular, but by no means exclusively) that views the common people with contempt and disdain, pillorying them as racist, bigoted and stupid, and considering their aspirations and needs beneath its lofty attention.

The poor are seen politically as merely a voting bloc, and their welfare begins and ends with how they can best be exploited by pretending to care for and nurture them.

Then, we have the political activist groups that have reached critical mass in the West, whose aim is not to improve society but to destroy it; tearing down its institutions without a plan or an idea of what could possibly replace them.

The damage that has been done is almost certainly irreparable, and stems from the actions of all parties involved, including those covering up for or denying evidence that is in the public domain. The US is split right down the middle, completely divided and diametrically opposed, each half believing the other side is the epitome of evil.

The confidence the people had in the impartiality and integrity of institutions like the Department Of Justice, the FBI and the judiciary has been eroded beyond repair, in part because of their own clearly partisan behaviour. 

Not one person on either side of this debate can remain impartial or objective, and the forensic audit of the 2020 election in the affected swing states, called for but never undertaken, was clearly necessary to restore confidence in the system of governance. Once allegations of this magnitude are made, regardless of merit, they have to be pursued vigorously and transparently. But this has not happened, “because Trump”.

For now, we will have to satisfy ourselves with this sad refrain: “The swamp is dead, long live the swamp!”

Karl Marx and the Politics of Envy:


Karl Heinrich Marx was born in Trier, Germany on May 5, 1818 and died in London, England on March 14, 1883 at the age of 64. After studying Law and Philosophy at Universities in Bonn and Berlin, he would come to global renown and influence on the basis of his “critical theories” that sought to scientifically examine society, economics and politics, formulating theories and ideas that would come to be known as Marxism. This doctrine emphasised the role of class conflict in the development of human societies.

Marx was primarily influenced by the Dialectical Idealism of Georg Wilhelm Friedrich Hegel, which was defined as “an attempt to explain the evolution of Western society through the use of dialectical forms, which rely upon the presumed, motive power of spiritual, mental, or ideal forces.” Marx rejected the idealism at the core of Hegel’s theory, and adapted it instead to formulate what might be termed Dialectical Materialism, a philosophy of science, history and nature that emphasises the importance of real-world conditions and the presence of contradictions within things, in relation to but not limited to class, labour and socio-economic interactions. 

Marx met his lifelong collaborator and sometime patron, Friedrich Engels, in 1844 at the Café de la Régence in Paris, whereupon Engels showed him his recent publication entitled “The Condition of the Working Class in England in 1844”. This convinced Marx that the working class would be the agent and instrument of the final revolution in history, and led him to launch into intensive study and critical appraisal of the foundational writings on political economy: theories formulated by such luminaries as Adam Smith, David Ricardo and James Mill.

Shortly thereafter, Marx became associated with a secretive organisation of like-minded radicals, the League of the Just– a Christian communist international revolutionary organisation, founded in 1836 by branching off from its ancestor, the League of Outlaws.

From Wikipedia:

“Marx thought the League would be just the sort of radical organisation that was needed to spur the working class of Europe toward the mass movement that would bring about a working-class revolution. However, to organise the working class into a mass movement the League had to cease its “secret” or “underground” orientation and operate in the open as a political party.”

This party was renamed the Communist League, with a political pamphlet written by Engels and Marx (that came to be better known as “The Communist Manifesto“) as the foremost wellspring for its organisational protocols and principles- a detailed programme for action.

The Communist Manifesto: by Karl Marx and Friedrich Engels (source: Booktable.com)

Having worn out his welcome in Belgium due to his political rabble rousing, Marx then moved to Cologne where he started the publication of a daily newspaper called the “Neue Rheinische Zeitung“, which he helped to finance through a recent inheritance from his father. The paper was designed to put forward news from across Europe with his own Marxist interpretation of events, and the newspaper featured Marx as its primary writer, and the dominant editorial influence.

Despite some contributions by fellow members of the Communist League, according to Friedrich Engels it remained, interestingly, “a simple dictatorship by Marx”. Reactionary counter-revolutions in both Germany and France led Marx to leave for England, where he would live in exile for the rest of his life. It was here that he would write his major, most influential works: “A Contribution to the Critique of Political Economy“, the 3 volume “Das Kapital” (his magnum opus), and “Theories of Surplus Value“.

Das Kapital (Capital): Karl Marx’s Three Volume Magnum Opus (from 12min Blog.com)

Whilst a thorough critique of these undoubtedly influential works is somewhat beyond the scope of my article, I intend instead to place most of my emphasis on the general ideological underpinnings of Marxist doctrine, its inherent contradictions and failings, and Karl Marx’s moral deficiencies as a human being.

Criticisms of Marxist Theory:

These criticisms of Marxist doctrine fall into a number of broad categories:

Historical Materialism– Marx locates historical change in the rise of class societies and the way humans labour together to make their livelihoods. For Marx, and his collaborator Engels, the ultimate cause and moving power of historical events are to be found in the economic development of society, and in the social and political upheavals wrought by changes to the mode of production. Marx tries to draw an oversimplified distinction between these changes in society (“the base”) and what he refers to as the “superstructure” of Law, Politics, the Arts and Literature, Morality and Religion, as though these important elements of any functioning society are not motive forces in their own right.

Historical Determinism– Marxism, in its theoretical foundation, designates a rigid finalist and mechanist conception of historical unfolding that makes the future appear as an inevitable and predetermined result of the past. Yet, ironically, this unfolding of history as predicted under Marx’s original theory has not even remotely come to pass. His much hoped-for revolution of the masses failed to eventuate, as the industrial revolution, and the capitalist free market economy he despised, promoted the financial opportunities and wellbeing of the workers to such an extent that the Marxist polemic could not gain any traction among its intended recipients. It instead became a much beloved curio in the bric-à-brac cabinet of the lofty academic classes, who continued to adhere to it even in the face of its many failings.

Suppression of human rights– In the absence of a free market, composed of the individual choices of all the components of the system and in a constant state of flux because that freedom of choice produces limitless possibilities, a communist state would, by its very nature, tend to erode the rights of its citizens and become increasingly authoritarian. Marx postulated the necessity of a violent revolution, and a dictatorship of the proletariat, which by its very nature would promote significant social upheaval and endanger (or worse) large numbers of innocent lives. Being collectivist by nature, and reliant on “the masses” rather than seeing the working class as unique individuals, Marx’s doctrine must necessarily infringe on the rights of the individual to self-determination, as all must be subservient to an often nebulous “greater good”.

Economic Criticisms:

a) Labour theory of value– British economist Alfred Marshall was quoted as saying:  “It is not true that the spinning of yarn in a factory … is the product of the labour of the operatives. It is the product of their labour, together with that of the employer and subordinate managers, and of the capital employed“.

Marshall similarly criticised the Marxian theory of value through the law of supply and demand. According to Marshall, price or value is determined not just by supply, but by the demand of the consumer. Labour does contribute to cost, but so do the wants and needs of consumers. The shift from labour being the source of all value to subjective individual evaluations creating all value undermines Marx’s economic conclusions and some of his social theories.

Marxism also fails to account for the “value” added to businesses through employers reinvesting some or all of their profits back into their businesses in order to grow and expand them, or to make them more efficient; the increased profitability that such investment entails can then feed back into better wages and other rewards for their employees.

b) Reduced incentives– The notion of an absolute equality of reward for one’s toil, across complex societies where working roles, responsibilities, dangers and skill levels vary so widely, completely underestimates the degree to which varied remuneration acts as a motive force for performing intellectually difficult, taxing, physically dangerous or unpleasant tasks that would simply be avoided should all labour, regardless of these factors, be rewarded equally. Whilst primitive hunter-gatherer societies may have been able to function in some rudimentary way (at the point of a spear, almost certainly), the relative similarity of the duties required of the various members of the tribe makes any comparison to a complex, multilayered modern society moot at best.

“It is the common error of Socialists to overlook the natural indolence of mankind; their tendency to be passive, to be the slaves of habit, to persist indefinitely in a course once chosen. Let them once attain any state of existence which they consider tolerable, and the danger to be apprehended is that they will thenceforth stagnate; will not exert themselves to improve, and by letting their faculties rust, will lose even the energy required to preserve them from deterioration. Competition may not be the best conceivable stimulus, but it is at present a necessary one, and no one can foresee the time when it will not be indispensable to progress.” John Stuart Mill

“This hope [that egalitarian reward would lead to a higher level of motivation], one that spread far beyond Marx, has been shown by both history and human experience to be irrelevant. For better or worse, human beings do not rise to such heights. Generations of socialists and socially oriented leaders have learned this to their disappointment and more often to their sorrow. The basic fact is clear: the good society must accept men and women as they are.” John Kenneth Galbraith

c) Distorted or absent price signals

Both Ludwig von Mises and Friedrich Hayek argued that the free market is the only possible solution to the economic calculation problem: without the information provided by market prices, Marxist socialism inherently lacks a method to rationally allocate resources. In practice, as though to prove this point, centrally planned Communist states (like the Soviet Union, for example) attempted to use mathematical techniques to determine and set prices, with results ranging from the generally discouraging to the catastrophic.

d) Internal inconsistency

“Karl Marx’s value theory and law of the tendency of the rate of profit to fall are internally inconsistent. In other words, the critics allege that Marx drew conclusions that actually do not follow from his theoretical premises. Once those errors are corrected, Marx’s conclusion that aggregate price and profit are determined by—and equal to—aggregate value and surplus value no longer holds true. This result calls into question his theory that the exploitation of workers is the sole source of profit.” (Wikipedia)

e) Lack of relevance to the modern world- John Maynard Keynes opined that “Das Kapital” was:

“an obsolete textbook which I know to be not only scientifically erroneous but without interest or application for the modern world.”

While Robert Solow wrote:

“Marx was an important and influential thinker, and Marxism has been a doctrine with intellectual and practical influence. The fact is, however, that most serious English-speaking economists regard Marxist economics as an irrelevant dead end.”

Marxism claims to be founded in the real world, but that is precisely where it founders: in its implementation, and in its inability to accurately represent real-world economics.

Marxism: The Many Headed Hydra

The Hydra, in Greek mythology, was a gigantic water-snake-like monster with nine heads, one of which was immortal. Anyone who attempted to behead the Hydra found that as soon as one head was cut off, two more heads would emerge from the fresh wound. The destruction of the Hydra became one of the 12 Labours of Hercules.

Orthodox Marxism is, in many ways, precisely akin to that mythical beast, in that no matter how many times it is refuted, or shown to be in error, it simply morphs into another manifestation of itself, whether it be Marxism-Leninism, Trotskyism, Libertarian Marxism, Stalinism, Maoism, Western Marxism, Marxist Humanism or Austromarxism. Ultimately, it has transformed in the current era into “Cultural Marxism”, a doctrine that has insinuated itself into every corner of the social fabric that had previously bound people in Western democracies together.

Many notable academics, such as Karl Popper, David Prychitko, Robert C. Allen, and Francis Fukuyama, have argued that the majority of Karl Marx’s predictions have failed. Marx predicted that wages would tend to depreciate and that capitalist economies would suffer worsening economic crises leading to the ultimate overthrow of the capitalist system. The socialist revolution would allegedly occur first in the most advanced capitalist nations, and once collective ownership had been established, all sources of class conflict would disappear. Contrary to Marx’s predictions, communist revolutions instead took place in undeveloped regions of Latin America and Asia, rather than in industrialized countries like the United States or the United Kingdom.

Popper has also argued that both the concept of Marx’s historical method and its application are fundamentally unfalsifiable, and that it is thus a pseudoscience that cannot be proven true or false:

“The Marxist theory of history, in spite of the serious efforts of some of its founders and followers, ultimately adopted this soothsaying practice. In some of its earlier formulations (for example in Marx’s analysis of the character of the ‘coming social revolution’) their predictions were testable, and in fact falsified. Yet, instead of accepting the refutations the followers of Marx re-interpreted both the theory and the evidence in order to make them agree. In this way they rescued the theory from refutation; but they did so at the price of adopting a device which made it irrefutable. They thus gave a ‘conventionalist twist’ to the theory; and by this stratagem they destroyed its much advertised claim to scientific status.”

Popper believed that Marxism may have been initially scientific, in that Marx had postulated a theory which was genuinely predictive. When Marx’s predictions were not in fact borne out, Popper argues that the theory was saved from falsification by the addition of ad hoc hypotheses which attempted to make it compatible with the facts. By this means, a theory which was initially genuinely scientific then degenerated into pseudoscientific dogma.

Popper also devoted much attention to dissecting the practice of using the dialectic in defence of Marxist thought, which was the very strategy employed by V.A. Lektorsky in his defence of Marxism against Popper’s criticisms. Among Popper’s conclusions was that Marxists used the dialectic as a means of side-stepping and evading criticisms, rather than actually answering or addressing them.

“Hegel thought that philosophy develops; yet his own system was to remain the last and highest stage of this development, and could not be superseded. The Marxists adopted the same attitude towards the Marxian system. Hence, Marx’s anti-dogmatic attitude exists only in the theory and not in the practice of orthodox Marxism, and dialectic is used by Marxists, following the example of Engels’ Anti-Dühring, mainly for the purposes of apologetics – to defend the Marxist system against criticism. As a rule critics are denounced for their failure to understand the dialectic, or proletarian science, or for being traitors. Thanks to dialectic the anti-dogmatic attitude has disappeared, and Marxism has established itself as a dogmatism which is elastic enough, by using its dialectic method, to evade any further attack. It has thus become what I have called reinforced dogmatism.”

Economist Thomas Sowell wrote in 1985: 

“What Marx accomplished was to produce such a comprehensive, dramatic, and fascinating vision that it could withstand innumerable empirical contradictions, logical refutations, and moral revulsions at its effects. The Marxian vision took the overwhelming complexity of the real world and made the parts fall into place, in a way that was intellectually exhilarating and conferred such a sense of moral superiority that opponents could be simply labelled and dismissed as moral lepers or blind reactionaries. Marxism was – and remains – a mighty instrument for the acquisition and maintenance of political power.”

Karl Marx, the Man:

Paul Johnson, in his book “Intellectuals” has this to say about Karl Marx as a man:

“….It must be said that he developed traits characteristic of a certain type of scholar, especially Talmudic ones: a tendency to accumulate immense masses of half-assimilated materials and to plan encyclopedic works which were never completed; a withering contempt for all non-scholars; and extreme assertiveness and irascibility in dealing with other scholars. Virtually all his work, indeed, has the hallmark of Talmudic study: it is essentially a commentary on, a critique of the work of others in his field.”

He continues: “The truth is, even the most superficial inquiry into Marx’s use of evidence forces one to treat with skepticism everything he wrote which relies on factual data”. For example, Johnson stated: “The whole of the key Chapter Eight of Capital is a deliberate and systematic falsification to prove a thesis, which an objective examination of the facts showed was untenable.”

Marx was born into the early decades of capitalism, and of the Industrial Revolution that fuelled it, which began to create a burgeoning middle class (“the bourgeoisie”) and, even worse, a thriving nouveau riche, both of which the elite old guard, and even intellectuals like Marx, despised equally and with a passion. They all resented the very existence of a middle class of aspirational people, and it was this hatred, not love nor even sympathy for the proletariat, that drove the formulation of his Marxist theory.

Marxism becomes, in my view, merely a secular and collaborative form of feudalism, inevitably devolving into a kleptocracy, where the apex of a cleverly conceived pyramid scheme bleeds all the other levels white, dragging them down to a base marked by equal shares of misery and privation. 

It is also unlikely that Karl Marx ever set foot in a factory in his whole life; in fact, he rebuffed Engels’ invitations on many occasions. Many of the facts he utilised in Das Kapital were drawn from decades-old government reports that did not reflect contemporary working conditions, and he was loose with the truth about the improving wages of factory workers, asserting the exact opposite. Perhaps the only truly exploited labourer he knew personally was his maid, whom he impregnated, forcing the resultant child into an orphanage, and whom he never paid a “red cent” for the domestic chores she performed in the Marx household.

Since the understanding that intellectuals like Marx have of the world of politics, and of the behaviours and motivations of real people, is fatally inadequate, they tend to succumb to the temptation to ignore real people and the real world before putting their ideas to the people. When people fail to react in the desired manner, these intellectuals become embittered and increasingly extreme in their behaviours and attitudes. In this respect, Marx was the quintessential exemplar of this truism.

Believing that they have all the answers, intellectuals of Marx’s ilk convince themselves that they do not need to bother with troublesome distractions like hard facts, and that they are justified in lying in the service of the higher truths that only they, in their enlightened wisdom, have glimpsed.

Johnson summarizes four aspects of Marx’s character: his taste for violence, his appetite for power, his inability to handle money and, above all, his tendency to exploit those around him.

and further:

“Marx lived his life in an atmosphere of extreme verbal violence, periodically exploding into violent rows and sometimes physical assault.”

Another perspective worthy of consideration:

“This article argues that Karl Marx and Friedrich Engels’s theory of history contained racist components. In Marx and Engels’s understanding, racial disparities emerged under the influence of shared natural and social conditions hardening into heredity and of the mixing of blood. They racialized skin-colour groups, ethnicities, nations and social classes, while endowing them with innate superior and inferior character traits. They regarded race as part of humanity’s natural conditions, upon which the production system rested. ‘Races’ endowed with superior qualities would boost economic development and productivity, while the less endowed ones would hold humanity back. Marxist race thinking reflected common Lamarckian and Romantic-Nationalist assumptions of the era.

Their numerous horrendous comments on Slavs, ‘Negroes’, Bedouins, Jews, Chinese and many others are well known, but the logic behind these comments has not been sufficiently examined.

Marx and Engels applied the term “Rasse”, or the English “race”, to a wide variety of human collectives – from skin-colour groups to ethnicities, nations and even to social classes. It is tempting to assume that they were applying the term loosely and that they were only unthinkingly repeating the stereotypes and prejudices of the day. On the contrary, I will argue that this common interpretation means to miss the serious points they were making. Whereas formal definitions and theories of race indeed cannot be found in their writings, their scattered comments add up to quite a coherent position on the question.

Nathaniel Weyl stands alone in having offered a solution for this problem. This author suggested that ‘historical materialism might be superimposed on certain more fundamental conditions which shaped man’s fate’. If Marx assumed that people of different races differed in ‘ability and hence in civilization-potential’, then ‘The more capable peoples would be expected to move more swiftly through the dialectically determined phases of the historical process and this would in turn stimulate their civilizational level’. Weyl suggests, in other words, that Marx regarded race as an element underlying the economic basis of society.”

In Conclusion:

Originally, the Marxist Left built its political program on the theory of class conflict. Karl Marx believed that the primary characteristic of industrial societies was the imbalance of power between capitalists and workers. The solution to that imbalance, according to Marx, was revolution: The workers would eventually gain consciousness of their plight, seize the means of production, overthrow the capitalist class and then usher in a new, ideal socialist society.

During the 20th century, a number of regimes underwent Marxist-style revolutions, and each ended in disaster. Socialist governments in the Soviet Union, China, Cambodia, Cuba and elsewhere racked up a body count of well north of 100 million people. They are remembered for gulags, show trials, executions and mass starvations. In practice, Marx’s ideas unleashed man’s darkest brutalities.

Marxist intellectuals in the West eventually came to realize that workers’ revolutions would never occur in Western Europe or the United States, which had large middle classes and rapidly improving standards of living. Marxist scholars in the West simply adapted their revolutionary theory to the social and racial unrest of the 1960s. Abandoning Marx’s economic dialectic of capitalists and workers, they substituted race for class and sought to create a revolutionary coalition of the dispossessed based on racial and ethnic categories.

In contrast to equality, “equity”, as defined and promoted by critical race theorists, is little more than reformulated Marxism. In the name of “equity”, UCLA law professor and critical race theorist Cheryl Harris has proposed suspending private property rights, and seizing land and wealth and redistributing them along racial lines. This is allegedly intended to promote equality of outcomes, based on quotas and affirmative action applied preferentially to various groups according to their victimhood status, rather than in proportion to an individual’s personal merit, intellect or capacity to learn, level of tenacity and effort, or work ethic.

The achievements and rewards of the worthy are thus sacrificed to the entitled. Those who sacrifice their time and energy toiling toward long-term goals; those who show intelligence and application in managing their lives; those who demonstrate thrift and wisely invest or save their hard-earned dollars toward a delayed (and often enhanced) reward; and those who unselfishly donate their energies and skills to higher-order aspirations that would otherwise be deemed too difficult or laborious to contemplate, all see their rewards appropriated, seized or redistributed to be “shared” with those who merely feel entitled to the achievements of others, but are in reality too ignorant, indolent, indignant or envious to tread such a difficult path to success themselves.

Thus it can be readily seen from all I have written above that Marxism is indeed the blueprint for the political exploitation of envy, and a society that becomes toxic with envy is one that inevitably destroys (by degrees) the individual, the family unit and then the broader society in turn. A society that has the misfortune to fall victim to this culture of covetousness, to which Marxism (in its various guises and forms) so deliberately aspires, is one doomed to failure, as the litany of nations that embraced Marxist theory throughout the 20th Century readily attests.

Woke Ideology and the Free World’s Cultural Revolution

When discussing the so-called “woke ideology”, and the Cultural Marxism at its fundamental core, two quotes from opposite sides of the political spectrum spring to mind:


“Political correctness is America’s newest form of intolerance, and it is especially pernicious because it comes disguised as tolerance.”
George Carlin

and

“Political Correctness is tyranny with manners.”
Charlton Heston

The ideology of “wokeness” claims to value being “aware” of the effects of the oppression and social injustice of past cultures upon today’s society. It allegedly seeks to challenge the status quo, and supposedly to improve the lives of arbitrarily defined “marginalised” people.

In its contemporary use, “woke” means social awareness, and being alert to social issues, racism, discrimination and injustice. Woke therefore carries a connotation of a shift in perspective — of waking up from a comfortable fantasy about the world, and realising that it isn’t what you previously had thought it was. For Black activists in particular, it then implies a call to action.

Under the banner of “tolerance” and “social justice”, the woke culture attacks anyone who holds differing views. The purveyors of this culture see everything through the lens of racism, sexism, homophobia, transphobia, xenophobia and hurt feelings. Even where these are deeply held beliefs, they serve as an extremely effective smokescreen, because those labels can conveniently be applied to anyone who challenges them, who can then be bullied incessantly and without conscience.

They also claim that they believe in free speech, but will only allow that speech which is approved by them, thereby actively suppressing free thought and intellectual inquiry. They have no tolerance for a diversity of opinions – which they suppress in the name of tolerance. This amounts to a pernicious form of censorship, designed to intimidate and “cancel” those with dissenting views to reach a uniformity of thought.

Some of the goals of the woke culture are: to dissolve the nuclear family, abolish capitalism, eliminate religion, re-write the constitution and raise children as gender neutral. Other actions by the woke culture, include such things as canceling certain films or books, and lobbying for the defunding of law enforcement, while simultaneously opposing prosecution of crimes committed by select “oppressed” groups within our societies.

The “woke” ideology is the cause at the heart of the intellectual rabbit hole down which modern society has plunged headlong, abandoning principles of tolerance, the rule of law, the presumption of innocence and due process in favour of intolerance (in the name of tolerance), vigilantism and mob “justice”, non-peaceful protests, trial by media and “kangaroo courts” of public opinion.

Belatedly, a growing band of truly progressive thinkers on the nominal Left of the political spectrum are beginning to realise that the genie let out of the bottle is likely to tear down the foundations of peace, stability and personal and individual freedom that took hundreds of years and many generations to establish, gutting them completely within a single generation that, in the absence of external foes to fight, has turned on itself.

Perhaps a more cynical definition of “Woke” is a more accurate reflection of the pitfalls of this ideology:

A person or group so hypersensitive to offence, and so pathetically desperate to appear socially conscious, that they will trample over any principle, commit any violent act, impinge upon any freedom, or impugn any reputation of anyone or anything that stands in the way of them signalling just how eminently “virtuous” they believe they truly are.

Examples of the cultivated ignorance and confected outrage of this woke ideology can be found in the concepts this way of thinking has produced:

a) “Silence is violence”: – Nothing epitomises the rampant intolerance and the insidious intent to wield political power with impunity to bully others quite like this emblematic phrase of the woke Left. Essentially, anyone who is not vocal in whole-hearted support of any and every strongly held belief of the woke ideology is complicit in “oppression” of the self-selected “marginalised” groups, and is guilty of perpetrating some form of latent violence directed toward them by their silence or inactivity.

It is hard to imagine a more unjust or more pernicious idea than this, making one “guilty” not just for what one says, but for what one does not say. The failure to utter certain words, prayers or pledges somehow becomes transformed into a confession of complicity or guilt. That demand for public affirmation has a profound chilling effect on freedom of speech, but more importantly on the freedom to keep one’s own counsel and remain neutral or uncommitted in any contest of ideas, which should be a basic human right.

This ideological mantra has resonant echoes of, and disturbing parallels to, similar ideas that became foundational in the Chinese Cultural Revolution, where Red Guards hounded and publicly shamed those academics or citizens not deemed sufficiently adherent to Maoist teachings and philosophy.

In the simplest possible terms, silence is NEVER violence. Ever. It is being manipulated by the radical Left as a speech code, which then becomes a speech command that everyone MUST comply with, or else. It is the working definition of enforced orthodoxy, where no dissenting voice or opinion is allowed, or tolerated.

b) “White fragility”: – Yet another catch-cry of radical Leftist race activists, this phrase was coined by Robin DiAngelo, a “diversity consultant” (irony implied), who argues that “whites” in America must face the racist bias implanted in them by a racist society. Their resistance to acknowledging this, she maintains, constitutes a “white fragility” that they must overcome in order for meaningful progress on both interpersonal and societal racism to occur.

This is not only a contemptible exercise in racial profiling of all people of Caucasian heritage, but a reductive argument that describes “white” people as some homogeneous blancmange of humanity who all share the same opinions, the same thought processes and the same entrenched beliefs and biases. In the minds of such “diversity” advocates, there is no diversity at all amongst “white” people, whether they are on the Left or Right of the political spectrum, religious or atheist, serf or aristocrat, living free in Western democracies or suffering under communist oppression, working class on Struggle St or living “high on the hog” in the lap of luxury. It is an astonishing display of arrogance to suggest that racial socialisation and identity do not differ by culture, ethnicity and nationality, or, more importantly still, between widely varying individuals.

In DiAngelo’s mind, to quote her: “Whites produce and reinforce the dominant narratives of society, such as individualism and meritocracy.” Even here, she (along with those influenced by her narrative) has such a skewed vision of reality that she believes these two concepts are exclusive to “whiteness”. Yet many Asian cultures, for but one example, are even more meritocratic than those found in the West, and nearly every culture not under the yoke of Communist oppression (whether Caucasian or Asian) tends to value individuality over conformity and uniformity, including those of African or other POC heritage.

Any society that does not value the individual and meritocratic virtues is doomed to poverty and failure, and to suggest that African Americans are incapable of, or uninterested in, merit or meritocratic structures, or cannot be defined as individuals, is as paternalistic as it is inherently racist. With the stroke of a pen, DiAngelo undercuts the diversity and tremendous achievements, often against great adversity, of a legion of meritorious African Americans over the last two hundred years, who could not possibly have reached that level of accomplishment without a desire for the precision and excellence that can only come from a deep appreciation of the virtues of merit, and a singularity of purpose that only exceptional individuals can muster.

c) “Micro-aggressions”: – These are defined as comments or actions that subtly, and often unconsciously or unintentionally, express a prejudiced attitude toward a member of a “marginalised” group (such as a racial minority). Whilst it can in some instances be true that small, cumulative slights are upsetting to some, the term “micro-aggression” is an arbitrary and inapt one, since the vast majority of these events are not remotely aggressive, being neither hostile, nor violent, nor in forceful pursuit of one’s own interests. The mutual exclusivity of “unintentional” and “aggression” is never even addressed in this worldview, and the term disappears in a morass of its own internal contradictions.

Rather than a much needed appeal for civility, respect and consideration, “micro-aggressions” can be on such a microscopic scale as to be imperceptible to all but the most hypersensitive of the perpetually offended. This becomes a prison for the offended as much as for the offender: those who perceive themselves to be marginalised or oppressed must be ever vigilant for even the slightest transgression by others, no matter how imperceptible, unintentional or even imagined, and must then process these “offences” as further oppressing and disadvantaging them relative to their peers. Such a fixation on microscopic slights and insults is a maladaptive response that entrenches one’s perceived victimhood, rather than allowing one to rise above such issues and become impervious to them. It is only through refusing to be a victim that one can break the cycle of self-pity and demoralisation.

Attempts to publicly shame, punish and even cancel “microaggressors” are uniformly disproportionate to the level of offence, with no recourse to explanation or a well argued defence, and are solely based on the subjective, emotional responses of a self-perceived “victim”, whose feelings are considered sacrosanct.

d) “Safe Spaces”: – This term originated within the mental health field, as an “umbrella term referring to non-clinical, peer-led supports for people in suicidal crisis. These spaces aim to provide an alternative to conventional mental health and hospital services, and are usually operated by peer workers with a lived experience of suicide.”

This model of “emotional refuges” has been adapted, and then co-opted, by those who consider themselves historically “marginalised” (and in some cases “oppressed”), who seek to congregate in extra-curricular safe havens, on university campuses in particular, effectively segregating themselves from the broader range of their peers, whom they perceive as threatening or oppressing them. As if that were not misguided and ill-conceived enough, a form of self-imposed apartheid isolating them still further from the broader community, this ideology extends to insulating themselves from reasoned debate, and from ideas that might challenge their preconceived notions, particularly ones that cannot withstand scrutiny or analysis in the public square.

As University of Chicago Dean of Students John Ellison wrote:

Our commitment to academic freedom means that we do not support so-called trigger warnings, we do not cancel invited speakers because their topics might prove controversial, and we do not condone the creation of intellectual ‘safe spaces’ where individuals can retreat from ideas and perspectives at odds with their own.

Such a cloistered and cosseted intellectual space, if anything, merely reduces their academic outlook to a parochial and utterly provincial worldview, one that cannot hope to compete with those whose mental acuity has been sharpened, and whose skills have been honed in a cosmopolitan and cultivated educational environment where all sides of a debate, and a wide variety of opinions and attitudes are encouraged to engage in a contest of ideas.

Cultural Marxism: The Ideology Underpinning the Woke Agenda

“Cultural Marxism” (also known as Neo-Marxism, Libertarian Marxism, Existential Marxism, or Western Marxism), in spite of furious denials and cries of “conspiracy theory” from Left-aligned commentators, is actually a well-established term in academic circles, and has appeared in the titles of numerous books and articles that treat it either dispassionately or favourably. The term simply refers to a twentieth-century development in Marxist thought that came to view Western culture as a key source of human oppression. At its essence, “Cultural Marxism is nothing more than the application of Marxist theory to culture”.

The influential Italian Marxist philosopher Antonio Gramsci, imprisoned under Mussolini’s regime, was the first to recast Marxist philosophy in cultural terms, inverting its emphasis on economics determining culture and instead placing culture at the forefront of transforming the economic landscape.

How can this be achieved? Through an army of Marxist aligned intellectuals undertaking “the long march through the institutions of power”; gradually colonising and ultimately controlling all the key institutions of civil society (e.g. the family unit, the church, the bureaucracy, trade unions, the education system).

As Gramsci put it, in the new order Socialism will triumph by first capturing the culture, via infiltration of schools, universities, churches and the media, thereby transforming the consciousness of society. If you hope to change the economic structure of society, you must first change the cultural institutions that socialise people into believing and behaving according to the dictates of the capitalist system. The only way to do this is by cutting the roots of Western civilisation – in particular, its Judeo-Christian values, for these (supposedly) are what provide the capitalist root system. In short, “unless and until Western culture is dechristianised, Western society will never be decapitalised”.

Almost simultaneously, but independently of Gramsci, the Marxist think tank and research centre known as the Institute for Social Research – more commonly known as the Frankfurt School – under the leadership of Max Horkheimer, became convinced that the major obstacle to human liberation was the capitalist ideology embedded in traditional Western culture.

Horkheimer recruited a range of up-and-coming Marxist intellectuals – notably Theodor Adorno (1903–1969) and Herbert Marcuse (1898–1979) – who could help to blend classical Marxist doctrines with both Darwinian sociology and Freudian psychology. The aim was to produce a new, synthesised form of Marxism (Neo-Marxism) that would do the job classical Marxism had failed to do: radically transform Western culture, and so help pave the way for a communist utopia.

This led the Frankfurt School to the development of “Critical Theory”, a form of incisive social critique aimed at undermining the status quo in the hope of changing society, allegedly for the better. Critical Theory stands opposed to what Horkheimer referred to as Traditional Theory, which aimed only at explaining society.

Whilst Adorno came to regret and disavow the violence and social upheaval that the doctrines of the Frankfurt School unleashed, Herbert Marcuse and others were entirely divorced from such considerations. They were obsessed instead with a pessimistic utopianism, determined to destroy the Western civilisation they collectively despised in a nihilistic rage that rejoiced in the ruins they would likely cause, without proposing any solutions whatsoever to replace the values and systems of governance they sought to demolish.

Given this lack of conscience or moral compass, divorced from the consequences this Marxist philosophy would entail, it is a very short step from Marcuse’s “repressive tolerance” to the pernicious influence of political correctness, free-speech crackdowns, de-platforming and an epidemic of thuggish, occasionally violent university “protests”, and thence to Antifa intimidation tactics and violence directed against illusory “fascists”, arbitrarily defined as anyone who is not a radical Marxist activist.

China‘s Cultural Revolution, where Cultural Marxism is taken to its logical conclusion:

Between 1966 and 1976, the young people of China rose up in an effort to purge the nation of the “Four Olds”: old customs, old culture, old habits, and old ideas. In other words, a radical transformation of the culture was to be undertaken, with violence and societal upheaval the inevitable result.

From Encyclopaedia Britannica:

The Great Proletarian Cultural Revolution was launched under the direction of the Chinese Communist Party (CCP) under Chairman Mao Zedong, who wished to renew the spirit of the communist revolution and root out those he considered to be “bourgeois” infiltrators—alluding, in part, to some of his CCP colleagues who were advocating a path for economic recovery that differed from Mao’s vision.

In the beginning, Mao pursued his goals through the Red Guards, groups of the country’s urban youths that were created through mass mobilization efforts. They were directed to root out those among the country’s population who weren’t “sufficiently revolutionary” and those suspected of being “bourgeois.”

The first targets of the Red Guards included Buddhist temples, churches, and mosques, which were razed to the ground or converted to other uses. Sacred texts, as well as Confucian writings, were burned, along with religious statues and other artwork. All sorts of antiquities and artifacts were taken from museums and private homes and were destroyed as symbols of “old thinking”. Any object associated with China’s pre-revolutionary past was liable to be destroyed.

In their fervor, the Red Guards began to persecute people deemed “counter-revolutionary” or “bourgeois,” as well. The Guards conducted so-called “struggle sessions“, in which they heaped abuse and public humiliation upon people accused of capitalist thoughts (usually these were teachers, monks, and other educated persons). These sessions often included physical violence, and many of the accused died, committed suicide, or ended up being held in re-education camps for years. 

The Red Guards had little oversight, and their actions led to anarchy and terror, as “suspect” individuals—traditionalists, educators, and intellectuals, for example—were persecuted and killed. For the entire decade of the Cultural Revolution, schools in China did not operate, leaving an entire generation with no formal education. All of the educated and professional people had been targets for re-education. Those who weren’t killed or maimed were dispersed across the countryside to toil away on farms, or in other hard manual labour.

The Red Guards were soon reined in by officials, although the brutality of the revolution continued. The revolution even saw high-ranking CCP officials, such as Deng Xiaoping and Lin Biao, fall in and out of favor.

The revolution eventually ended in the fall of 1976, after the death of Mao in September and the downfall of the so-called “Gang of Four” (a group of radical pro-Mao CCP members) the following month, although it was officially declared over in August 1977 by the 11th Party Congress. The revolution left many people dead (estimates range from 500,000 to 2,000,000), displaced millions of people, and completely disrupted the country’s economy. Although Mao had intended for his revolution to strengthen communism, it had, ironically, the opposite effect, instead leading to China’s embrace of capitalism.

The Free World’s Cultural Revolution Redux:

The parallels between China’s Cultural Revolution and the current tsunami of woke ideology under the influence of Cultural Marxism in the West could not be more obvious, with far-Left activists determined to mimic many of the worst excesses of that catastrophic example.

The roving gangs of Red Guards are replaced in the modern example with “anti-fascist” activists, who pounce upon even the most innocuous comments or actions as a pretext for unleashing torrents of abuse and name-calling, doxxing (publicly revealing previously private personal information about an individual or organisation), public shaming and lobbying for expulsion from one’s community or employment, or, in the worst cases, even assaulting and maiming in the street.

Like their Chinese antecedents, these youthful, “grass roots” zealots engage in the whole suite of vigilante justice from stalking to harassment to intimidation to cancellation against not only their political adversaries, but as “Silence is Violence” clearly demonstrates, even those who merely do not come out in enthusiastic support for the behaviour and ideology of the Marxist Left.

Anyone who is not aligned with the radical Left is immediately labelled a fascist, Nazi, homophobe, transphobe, racist or bigot, regardless of the actual propensities of the individual involved. Those propensities are in fact beside the point, since it is the exercise of power and the intimidation of others that most interests them, ostensibly in service of their “cause”, but more often fulfilling a deeply rooted need to punish others and to aggrandise themselves as morally superior.

Like the Chinese Cultural Revolution before it, the insistence on ideological purity gives often quite unenlightened student activists the license to berate, threaten and intimidate their betters: teachers and professors whose world view is significantly more nuanced and sophisticated than theirs could ever hope to be.

The indoctrination practices, inculcated through Marxist ideological priming, are often more subtle and indirect than in days gone by, being interwoven into the education process via carefully crafted curricula, or by teacher-led discussions producing the “desired” conclusion. Children’s and adolescents’ beliefs are often molded without them even realising it. The methods being utilised are modern, and the tools of submission and conformity change from time to time depending on the opportunity, but the ideological goals remain the same.

Consequences of Woke Ideology on the Democratic West:

As a consequence of this brave new Free World Cultural Revolution, societies that once valued Courage, Individuality, Self-reliance, Critical Thinking, Industry, Creativity, Philanthropy, Family and Service to Community above all else have gradually replaced these laudable values and principles with a propensity to Welfarism, Complacency, Apathy, Dependence, Decadence, Narcissism and Group Think. It is clear that woke political activist groups have reached a critical mass in the West, and that their aim is not to improve society but to destroy it, following the example of Marcuse in tearing down institutions with no plan or idea of what could possibly replace them.

Without these moral and social underpinnings, and harbouring vicarious guilt for perceived past injustices allegedly perpetrated before they were even born, by people they have never met, is it any wonder that children and teenagers grow up without any true sense of self-worth, insecure in their place in the world, and resort instead to dependence upon social media, drugs and alcohol, and mindless distractions to give them a purpose they cannot otherwise find in the “real world”?

Western nations are comprised of real people with real aspirations, who put in real hard work and sweat in order to establish themselves and make a life from the sometimes limited opportunities given them, only to have it all snatched away in a heartbeat by indoctrinated ideologues and unruly mobs hellbent on destruction for its own sake, using often tenuous social justice issues as a pretext to wantonly destroy that which others had the fortitude and tenacity to build.

The true aims of the Cultural Marxist perspective and the woke ideology are not to promote social justice (as its adherents would claim), but to remove any sense of pride in, and respect for, one’s own culture, family and community, and any sense of life being purposeful and worthy. They seek to replace these instead with nihilistic hatred or self-abnegating disdain for the parents and grandparents who paved the way for us, for the people around us who make up our neighbourhoods and communities, for authority figures like teachers, police, the courts of law and parliament, for one’s nation as a whole, and in particular for our predominantly Western-derived culture and civilisation.

The way to fight racism, as but one example of a social justice issue used as a pretext in this revolution, is to follow the aspiration of Martin Luther King in his wonderful and emblematic “I have a dream” speech, where individuals should be “judged not by the colour of their skin, but by the content of their character”. Instead of this laudable aspiration, we have begun down a path of identifying ourselves by our “tribe”, once again promoting the idea that one’s race defines one as an individual, and that those within the tribe share a common belief system and experience that those in other tribes cannot possibly understand.

This creates an artificial demarcation, and the characterisation serves not to promote racial harmony, but to further divide us all by race. Instead of looking forward to an era in the very near future where race is irrelevant (as it should be), and hence racism would become a thing of a dark past in the human experience, we now face a future of increasing racial intolerance, suspicion and separatism, carefully cultivated by cynical people who derive unseemly pleasure from tearing down the pillars of decent society.

The Solution to Woke Ideology is Insistence on Adherence to Principle:

A significant number of people, particularly those in positions of power and influence, seem to have lost touch with some basic principles to which the West has previously striven to adhere. It is through the failure to adhere properly and consistently to these basic principles that the radical Left agenda has been allowed to flourish, its adherents infiltrating those institutions meant to oversee and administer these principles with rigour and fairness.

1) Freedom of assembly, association and speech: allowing citizens to gather together, associate and engage without hindrance in the contest of ideas, including ideas that some might find offensive; speech should only be limited for those who would use it to incite violence or physical harm against individuals or groups. A free press is clearly an important component of free speech, and should be protected and encouraged to propagate a variety of views, not merely those convenient to the government of the day- in other words, the press must be truly free rather than cowed into submission under the yoke of an authoritarian government, or under the boot heel of various oppressive ideologies.

2) The right of the individual to human dignity and autonomy over their body or person, the liberty to pursue an individual’s personal goals, both public and private, unhindered by unwarranted government interference or oversight, provided it does not impact negatively on the rights or freedoms of other individuals.

3) Equality of application of the Rule of Law: rules which should apply equally to EVERY individual regardless of socio-economic, political, racial or religious status, and which are known by all, are predictable, and are applied impartially. Laws should be as limited in scope as practicable for a functioning society, and should never be arbitrarily, selectively or vindictively applied.

4) Government must obtain, and then maintain as best it is able, the consent of the governed, where each citizen is able to participate in public discourse freely and without coercion, where tolerance for political opposition on all sides is scrupulously maintained, and where all citizens of appropriate age and mental capacity are afforded the opportunity to vote in free and fairly held elections that are transparent and carefully guarded against corruption, manipulation or malfeasance.

5) Government must be accountable to the people, must be transparent and fair in its application of its powers at hand, and should fairly strive for the common good of its citizens, without unwarranted infringement of the rights of the individual to life and liberty unhindered by unnecessary government or bureaucratic interference. There must be adequate safeguards, checks and balances against the abuse of power that our governments have been given, powers that they should view as a privilege entrusted to them by those they serve.

6) Governments should, as much as humanly possible, provide equality of opportunity (but not necessarily of outcomes) for all citizens, and protect the rights and freedoms of minorities so long as those protections do not infringe on the rights and liberties of others. They should also be inclusive, respectful and welcoming of those minorities in our nation’s political discourse and processes, and should allow them equal access and pathways (but not prioritisation) to entry and engagement in the political sphere.

7) The separation of powers between the legislative, executive and judicial branches of government should be scrupulously defended and maintained. The legislature that makes the laws should remain independent of the executive that puts those laws into operation, and the judiciary should interpret those laws independently of both. Whilst each branch remains autonomous and independent of the others, each should adhere, as much as is possible, to the fundamental principles outlined above, and be held accountable if it fails in its duties to those principles.

Whilst these principles should be somewhat self-evident, and form the aspirational framework for any individual or group seeking power and influence in a society wishing to optimise its health and function, the woke ideologues have something other than individual and collective happiness, social cohesiveness, wealth creation and economic prosperity in mind in pursuit of their ideological goals. The only way for such ideologues to succeed in undermining these principles is for good men and women to do nothing in their defence, allowing the unprincipled free rein to destroy what took many generations to build and preserve.

Climate Myopia: How the Current Generations Diminish the Past to Exaggerate the Present and Catastrophise the Future

What strikes me about “Climate Change” advocates is how readily they nominate current weather conditions as “unprecedented”, when even a cursory scan of the near and distant past finds almost identical, and often far worse, examples of extreme weather events, the majority of which occurred prior to 1945, before anthropogenic CO2 emissions became significant, and therefore without any possible man-made causation.

The global temperature since the last glacial period (~11,000 years ago) has actually varied widely quite naturally over the entire Holocene interglacial period up until the present day, without there being any possible influence from mankind’s industrial or agricultural activity for the vast majority of that time. 

Clear peaks are seen in the paleo-climate record, coinciding most recently with the known Minoan Warming Period (3,300 yrs BP, i.e. Before Present), the later Roman Warming Period (2,150 yrs BP) and finally the Medieval Warming Period (1,100 yrs BP), all of which preceded the current rise in temperature since 1880, which might be termed, for the sake of convenience, the Modern Warming Period (150 yrs BP till present).

The early Holocene, known commonly as the Holocene Climate Optimum, was a period of atmospheric temperatures that were probably between 2 and 6 deg C warmer than today, yet this cannot possibly be attributed to human activity, nor to CO2 concentration in the atmosphere (which varied between 260 and 270 ppm throughout), nor to any other known effect that the much vaunted climate computer models can simulate or attribute.

Also, within the early Holocene, the world’s oceans were probably 0.7°C warmer on average than today 8,000 years ago, and more regionally, Northern Hemisphere ocean temperatures between 9,000 and 7,000 years ago were likely around 2.5°C ± 0.4°C warmer than the late 20th Century (see Rosenthal et al., 2013).

There were also very clear and quite rapid swings in temperature, with periods of polar cooling, increased tropical aridity and major atmospheric circulation changes notable during the periods 9,000 to 8,000, 6,000 to 5,000, and 4,200 to 3,800 BP. These cooling periods are in addition to the warming periods mentioned previously, as well as the intervals between abrupt cooling phases (Mayewski et al., Quaternary Research 62 (2004), pp. 243-255).

These unexplained climate changes were often very rapid in onset and independent of CO2 variability, thereby calling into question the idea of a stable, non-fluctuating global climate as proposed by Mann et al. in their (in)famous “Hockey Stick” paper.

Northern Africa is a particular case in point:

Researchers have identified five episodes over the past 6,000 years when dramatic changes occurred in Egypt’s climate, three of which coincided with extreme environmental changes as the climate shifted to more arid conditions. These colder, drier periods also coincided with upheaval in Egyptian society, such as the collapse of the Old Kingdom around 4,000 years ago and the fall of the New Kingdom about 3,000 years ago. There were three large pulses of aridification as Egypt went from a wetter to a drier climate, starting with the end of the African Humid Period 5,500 years ago, when the monsoons shifted to the south.

At the same time, human population densities were increasing, and competition for space along the Nile Valley would have had a large impact on both animal and human populations. A similar colder, drier interval also contributed to the demise of the nearby Hittite Empire in Anatolia (modern-day Turkey) at around the same time. A 2013 study of data from coastal sites in Cyprus and Syria showed evidence that a 300-year drought began around the beginning of the twelfth century B.C., coinciding with the Late Bronze Age collapse. A drought of this magnitude would have caused crop failures and widespread famine, as well as trade disruption across the Ancient Near East.

Prolonged global cooling periods were also likely a highly significant factor in the decline and fall of the Roman Empire, and in the subsequent period known as the Dark Ages in Europe, from around 400-800 A.D.

The Late Antique Little Ice Age (536 A.D. to 630 A.D.), for example, coincided with the Justinian plague, the collapse of the Sasanian Empire, large-scale migrations from the Asian steppe and Arabian Peninsula due to climate conditions inhospitable to subsistence, the spread of Slavic-speaking peoples across Eurasia, and various political upheavals in China.

This generally colder climate era was then followed by the relative warmth and prosperity of the Medieval Warming Period, which saw the treelines in Scandinavia, the Polar Urals, the north of Siberia and the Northwest Territories of Canada extend much further north of their current northernmost limits, and to much higher altitudes also, pointing to a more benign and generally warmer climate within the Arctic circle, conducive to the growth of these trees, many species of which are quite temperature sensitive (e.g. Siberian larch).

From the Leif Kullman (Umea University) paper:

“The present paper reports results from an extensive project aiming at improved understanding of postglacial subalpine/alpine vegetation, treeline, glacier and climate history in the Scandes of northern Sweden. 

The main methodology is analyses of mega fossil tree remnants, i.e. trunks, roots and cones, recently exposed at the fringe of receding glaciers and snow/ice patches. 

This approach has a spatial resolution and accuracy, which exceeds any other option for tree cover reconstruction in high-altitude mountain landscapes. 

The main focus was on the forefields of the glacier Tärnaglaciären in southern Swedish Lapland (1470-1245 m a.s.l.). Altogether seven megafossils were found and radio-carbon dated (4 Betula, 2 Pinus and 1 Picea). Betula and Pinus range in age between 9435 and 6665 cal. yr BP. 

The most remarkable discovery was a cone of Picea abies, contained in an outwash peat cake, dating 11 200 cal. yr BP. The peat cake also contained common boreal ground cover vascular plant species and bryophytes. 

All recovered tree specimens originate from exceptionally high elevations, about 600-700 m atop of modern treeline positions. This implies, corrected for land uplift, summer temperatures, at least 3.6 °C higher than present-day standards.”

This era also included the Norse colonisation of Greenland, from around 980 A.D. to 1430 A.D., and it also saw grapes grown in England several hundred kilometres north of their present northernmost limits, while figs and olives were grown routinely in regions that would now be unsuitable to sustain them. Not that this era was universally beneficial to all regions of the globe, given that it also saw episodes of extreme aridity in what is now California, and in much of the Western half of North America, including two epic mega-droughts lasting well over a century each, in ~800-950 A.D. and again in ~1050-1200 A.D.

When this era dwindled, it once again led to gradual global cooling, culminating in the lead-up to and aftermath of the Maunder solar minimum, a period known as “The Little Ice Age”. This era of generally declining temperatures and more frequent episodes of harsh cold in the Northern Hemisphere was arguably the coldest sustained period since the last glacial period 11,000 years BP, and this mini-Ice Age contributed to such inauspicious events as the Black Death, the Great Irish Frost of 1740 and the French Revolution, and in addition significantly influenced the course of the Napoleonic Wars and the American War of Independence.

Closer to home in Australia, the recent Millennium Drought (2001–09) was clearly one of the most severe drought periods in recent history and seems “unprecedented”, until you look beyond the very recent climate record to the intensity and extent of the even more severe and more extensive Federation Drought a mere hundred years prior, or the eight to ten other severe drought periods that occurred between 1800 and the present, or the severe drought that nearly caused the First Fleet to perish in 1788-90, or the paleo-climate data showing almost 8 decades out of 10 of severe drought during the Medieval Warming Period. Around 1100 CE, Australia had perishing drought far worse than anything we have ever experienced in the modern era for 80 out of 100 years. Conditions were so bad, and game so scarce, that the indigenous population in the south migrated to Arnhem Land in the Northern Territory in search of water, as the monsoons only tracked that far south. Hence the spiritual importance of Arnhem Land to indigenous Australians: it quite literally saved them from possible extinction.

From the National Museum website:

History of drought in Australia 

“Historical accounts and scientific analysis indicate that South-Eastern Australia experienced 27 drought years between 1788 and 1860, and at least 10 major droughts between 1860 and 2000.

The Federation Drought from 1895 to 1903 was the worst in Australia’s recorded history, if measured by the enormous stock losses it caused. It ended squatter-dominated pastoralism in New South Wales and Queensland, as bank foreclosures and the resumption of leases led to the partition of large stations for more intensive settlement and agricultural use. Many consider this drought, which affected almost the whole country, to have been the most destructive in Australian recorded history, owing to the enormous toll it took on sheep and cattle numbers. 

Many pastoralists were overwhelmed by the debt they incurred buying feed, controlling rabbits and repairing infrastructure damaged by dust storms. Banks foreclosed on some stations and many graziers walked off their land. Some five million acres of leasehold country in the NSW Western Division was reported to have been abandoned between 1891 and 1901.”

The recent south-eastern Australian bushfires in 2019/2020 seem unprecedented until you compare them with the 1851 Victorian bushfires, which burned more than a quarter of the entire state of Victoria from a single ignition point.

By contrast, the history of floods in Australia also shows that the current floods in southeast Queensland and Northern New South Wales are scarcely “unprecedented”, as has been claimed in some quarters. The Black February Brisbane floods are particularly instructive:

The 1893 Brisbane flood, occasionally referred to as the Great Flood of 1893 or the Black February flood, occurred when the Brisbane River burst its banks on three occasions in February 1893. There was also a fourth flood event later in the same year, in June. The first flood, on 6 February, was due to a deluge associated with a tropical cyclone called “Buninyong”. The second cyclone struck on 11 February, causing relatively minor flooding compared to the first. When the third cyclone came on 19 February, it was almost as devastating as the first, and it left up to one third of Brisbane’s residents homeless.

For the first flood, Crohamhurst recorded an all-time Australian record of 907 millimetres (35.7 in) of rain in a 24-hour period.  The water surge was recorded on the Port Office gauge (now the City gauge) as being 8.35 metres (27 feet, 5 inches) above the low tide level. The February 1893 floods were the second and third highest water levels ever recorded at the City gauge, the highest being the January 1841 flood at 8.43 metres (27 feet, 8 inches).

There was, however, some oral Aboriginal history suggesting a flood level of nearly 12 metres prior to the first European settlement. The 1893 events were also preceded by two notable yet less severe floods in 1887 and 1890.

The world’s highest ever temperature was at Death Valley in 1913 at 56.7 deg C (a tick over 134 deg F), but the longest ever heatwave occurred in Australia at Marble Bar in 1923/24 of 160 straight days above 37.8 deg C (notwithstanding the Australian Bureau of Meteorology’s Orwellian interventions to eliminate these records, and to continuously redefine the meaning of the term “heatwave”).

Then came the severe NSW heatwaves of 1896, which killed 437 people and caused a mass exodus from regional NSW to the coast. From Lance Pidgeon and Chris Gilham:

“In January 1896 a savage blast “like a furnace” stretched across Australia from east to west and lasted for weeks. The death toll reached 437 people in the eastern states. Newspaper reports showed that in Bourke the heat approached 120°F (48.9°C) on three days. The maximum stayed at or above 102 degrees F (38.9°C) for 24 days straight.

By Tuesday Jan 14, people were reported falling dead in the streets. Unable to sleep, people in Brewarrina walked the streets at night for hours, the thermometer recording 109F at midnight. Overnight, the temperature did not fall below 103°F. On Jan 18 in Wilcannia, five deaths were recorded in one day, the hospitals were overcrowded and reports said that “more deaths are hourly expected”. By January 24, in Bourke, many businesses had shut down (almost everything bar the hotels). Panic stricken Australians were fleeing to the hills in climate refugee trains.  As reported at the time, the government felt the situation was so serious that to save lives and ease the suffering of its citizens they added cheaper train services. It got hotter and hotter and the crowded trains ran on more days of the week. The area of exodus was extended to allow not only refugees from western NSW to flee to the Blue Mountains but also people to escape via train from the Riverina to the Snowy Mountains. “

Clearly, in Australia, the 1890s were a decade punctuated by many of the most extreme weather events ever experienced in recorded history, and would be more accurately described as “unprecedented” than anything experienced before or since. From the Federation Drought, to the incredibly severe Category 5 Cyclone Mahina (the most intense cyclone ever to make landfall in Australia) in 1899, to the Black February Brisbane floods in 1893, to the severe NSW heatwaves in 1896, the decade prior to the nation’s Federation in 1901 makes the recent extreme weather events pale into insignificance.

Globally, a series of mega-droughts is seen in the paleo-climate record to have occurred in sub-Saharan Africa from 130,000 to 90,000 years ago. Rainfall was so scarce that Lake Malawi’s water level dropped 2,000 feet (nearly 700 metres), and lush forests were turned into arid scrubland. In Meso-America, a series of withering droughts from 750-900 A.D. led to the collapse of the Mayan civilisation, whilst recent Californian droughts pale into insignificance when compared to the two separate two-hundred-year mega-droughts between the 9th and 12th centuries mentioned previously.

Hurricane Katrina, which struck New Orleans in 2005, seems uniquely devastating until you compare it to the Great Hurricane of the Lesser Antilles in 1780, which killed 22,000 people and felled or denuded every single tree in Barbados, or even the 1900 Galveston Hurricane, which caused 8,000 to 10,000 deaths. Cyclone Yasi, which struck north Queensland in 2011, seems catastrophic until you compare it to Cyclone Mahina in 1899, which struck the same region more than 100 years before. Cyclone Mahina, still the deadliest cyclone in recorded Australian history, had the lowest barometric pressure (880 mbar) ever recorded in the Southern Hemisphere, and a storm surge height of 13 metres (43 feet).

“The 1780 Atlantic hurricane season ran through the summer and fall in 1780. The 1780 season was extraordinarily destructive, and was the deadliest Atlantic hurricane season in recorded history with over 28,000 deaths. Four different hurricanes, one in June and three in October, caused at least 1,000 deaths each; this event has never been repeated and only in the 1893 and 2005 seasons were there two such hurricanes.”

“The Great Hurricane of 1780, also known as Huracán San Calixto, the Great Hurricane of the Antilles, and the 1780 Disaster, is the deadliest Atlantic hurricane on record. Between 22,000 and 27,501 people died throughout the Lesser Antilles when the storm passed through them from October 10–16. Specifics on the hurricane’s track and strength are unknown because the official Atlantic hurricane database goes back only to 1851.”

There was, of course, no way of accurately measuring hurricane intensity as there is now, but the salient point is that there were 4 Category 4 or 5 equivalent hurricanes in a 12-month period, in the midst of the Little Ice Age, well before the industrial revolution and any possibility of “man-made” weather events.

“This hurricane was first encountered by a boat in the eastern Caribbean Sea, but it probably developed in early October in the eastern Atlantic Ocean off the Cape Verde Islands. The system strengthened and expanded as it tracked slowly westward and first began affecting Barbados late on October 9. Late on October 10, the worst of the hurricane passed over the island, with at least one estimate of winds as high as 200 mph (320 km/h) during landfall, which is higher than any other 1-minute sustained wind speed in recorded Atlantic basin history.”

“The hurricane stripped the bark off trees and left none standing on Barbados. Cuban meteorologist José Carlos Millás has estimated that this damage could be caused only by winds exceeding 200 miles per hour (320 km/h). Every house and fort on Barbados was destroyed. According to British Admiral George Brydges Rodney, the winds carried their heavy cannons aloft 100 feet (30 m).”

Not that the extreme weather events of the Little Ice Age were confined to severe hurricanes, as Jim Steele writes:

“A series of Little Ice Age droughts lasting several decades devastated Asia between the mid 1300s and 1400s. Resulting famines caused significant societal upheaval within India, China, Sri Lanka, and Cambodia. 

Bad weather resulted in the Great Famine of 1315-1317 which decimated Europe causing extreme levels of crime, disease, mass death, cannibalism and infanticide. 

The North American tree-ring data reveal mega-droughts lasting several decades during the cool 1500s. The Victorian Great Drought from 1876 to 1878 brought great suffering across much of the tropics with India devastated the most. More than 30 million people are thought to have died at this time from famine worldwide. 

The Little Ice Age droughts and famines forced great societal upheaval, and the resulting natural climate change refugees were forced to seek better lands. But those movements also spread horrendous epidemics. 

Wild climate swings brought cold and dry weather to central Asia. That forced the Mongols to search for better grazing. As they invaded new territories they spread the Bubonic plague which had devastated parts of Asia earlier. 

In the 1300s the Mongols passed the plague to Italian merchant ships who then brought it to Europe where it quickly killed one third of Europe’s population. European explorers looking for new trade routes brought smallpox to the Americas, causing small native tribes to go extinct and decimating 25% to 50% of larger tribes. Introduced diseases rapidly reduced Mexico’s population from 30 million to 3 million.”

“By the 1700s a new killer began to dominate – accidental hypothermia. When indoor temperatures fall below 48°F for prolonged periods, the human body struggles to keep warm, setting off a series of reactions that causes stress and can result in heart attacks. 

As recently as the 1960s in Great Britain, 20,000 elderly and malnourished people who lacked central heating died from accidental hypothermia. As people with poor heating faced bouts of extreme cold in the 1700s, accidental hypothermia was rampant.” 

The current alleged “Anthropocene”, a term much beloved of Climate Change Cassandras, is far preferable to such climate related calamities as the Great Famine of 1315-1318, the Black Death, the General Crisis of the 17th Century, the Great Storm of 1703, the Great Frost of 1709, the Great Hurricane of 1780, the Year without Summer (1816) and the Irish Potato Famine in 1845.

At the end of the 1300s, following the Great Famine and the Bubonic Plague epidemic, the earth sustained only 350 million people. With today’s advances in technology and milder growing conditions, record high crop yields are now feeding a human population that has ballooned to over 7.6 billion. That in itself is prima facie evidence of the far more benign climate conditions that have underpinned that growth in sustainable population, and of the precipitous drop in climate-related deaths compared to the centuries from 1300 to 1850.

The Great Storm of 1703

From Jo Nova

Back when CO2 levels were ideal, the UK was hit by a monster nine-day storm: at least 8,000 dead, maybe as many as 15,000 people. Some 2,000 chimney stacks were blown down and 4,000 oak trees were lost in the New Forest alone. About 400 windmills were destroyed, with “the wind driving their wooden gears so fast that some burst into flames”. The worst toll was probably on ships — with some 6,000 sailors thought to be lost. As many as 700 ships were heaped together in the Pool of London, and one ship was found 15 miles (24 km) inland. A ship torn from its moorings in the Helford River in Cornwall was blown for 200 miles (320 km) before grounding eight hours later on the Isle of Wight.

Back then, people blamed the “crying sins of the nation” and saw it as punishment by God. The government declared 19 January 1704 a day of fasting, saying that it “loudly calls for the deepest and most solemn humiliation of our people”.

From the BBC

The storm uprooted thousands of trees; blew tiles from rooftops, which smashed windows in their paths; and flung ships from their moorings in the River Thames. A boat in Whitstable, Kent was blown 250m inland from the water’s edge.

As Britain slept, the wind lifted and dropped chimney stacks, killing people in their beds. It blew fish out of the ponds and onto the banks in London’s St James’s Park, beat birds to the ground and swept farm animals away to their deaths. Oaks collapsed and pieces of timber, iron and lead blasted through the streets. The gales blew a man into the air and over a hedge. A cow was blown into the high branches of a tree. Lightning kindled fires in Whitehall and Greenwich. From the hours of five in the morning until half past six, the storm roared at its strongest. It is thought between 8,000 and 15,000 people in total were killed.

Strong and persistent winds had already blown through the country for 14 days leading up to the storm. Those winds were already fierce enough to topple chimneys, destroy ships and blow tiles from the roofs of houses.

“In terms of its dramatic impact, it’s up there with the best of them,” says Dennis Wheeler, emeritus professor of climatology at the University of Sunderland. “Thousands of sailors died. The number was put at about 6,000. At the time, we were engaged with the War of the Spanish Succession, so we could ill afford to lose them. We lost a lot of ships, a lot of trade, and there was horrendous damage.”

The Great Irish Frost of 1740: The Longest Period of Extreme Cold in Modern European History

From Biot Report #442: July 13, 2007

An extraordinary climatic shock—the Great Frost—struck Ireland and the rest of Europe between December 1739 and September 1741, after a decade of relatively mild winters. Its cause remains unknown. Charting its course sharply illuminates the connectivity between climate change and famine, epidemic disease, economies, energy sources, and politics. David Dickson, author of Arctic Ireland (1997) provides keen insights into each of these areas, which may have application to human behaviors during similar future climatic shocks. The crisis of 1740-1741 should not be confused with the equally devastating Great Potato Famine in Ireland of the 1840s.

Though no barometric or temperature readings for Ireland (population in 1740 of 2.4 million people) survive from the Great Frost, Englishmen were using the mercury thermometer invented 25 years earlier by the Dutch pioneer Fahrenheit. Indoor values during January 1740 were as low as 10 degrees Fahrenheit. The one outdoor reading that has survived was 23 degrees Fahrenheit, not including the wind chill factor, which was severe. This kind of weather was “quite outside British or Irish experience,” notes Dickson.

From Dennis Avery:

Ireland suffered the most severely. In the depths of the winter of 1739-40, winds and terrible cold intensified. Rivers, lakes, and waterfalls froze and fish died in the first few weeks of the Great Frost. Coal dealers found their coal piles and unloading docks frozen solid. Mill-wheels froze, so the millers and bakers could produce no bread.

Ireland’s crucial potato crop—normally left in the ground until needed for food—froze underground. The tubers were ruined for food, and useless as seed for the following year. The following spring came drought, and the winds remained fierce. The winter wheat and barley sown the previous fall died in the fields. Sheep and cattle died in the pastures. The fall of 1740 saw a small harvest, but the dairy cattle had been so starved that few of them bore calves. Milk production plummeted as the cows’ milk dried up.

That winter, blizzards ranged along the coast, and great chunks of ice sweeping down the River Liffey sank vessels in the harbor. Dublin wheat prices rose to all-time highs.

Meanwhile dysentery, smallpox, and typhus were ravaging a weakened population. Farm workers had seldom gone to town when they had food on their tables; now they wandered in seeking food or relief assistance—and died in great numbers. All up, over 400,000 people are thought to have perished.

At the other extreme, the Little Ice Age was also a time that experienced episodes of great drought, particularly the European mega-drought of 1539/40:

From Andrew Frey:

Eleven months without rain, a million deaths – in 1540, a drought unprecedented before or since devastated all of Europe.

For eleven months there was almost no rain; the temperature was five to seven degrees above the normal values of the 20th century, and in midsummer it must have climbed above 40 degrees Celsius. Countless forest areas in Europe went up in flames, acrid smoke obscured the sunlight, and not a single thunderstorm was recorded for the entire summer of 1540.

July brought such a terrible scorching heat that the churches sent out prayers of supplication, while the Rhine, Elbe and Seine could be waded through without getting wet. Where water still flowed, the warm broth turned green, and fish floated in it keel up. The level of Lake Constance dropped to a record low, and Lindau was even connected to the mainland. The surface water soon evaporated completely, the ground burst open, and some drying cracks were so large that a foot could fit in them.

And the groundwater dropped too: in the Swiss canton of Lucerne, desperate people tried to dig for water in a river bed, but found not a single drop even at a depth of one and a half meters. Christian Pfister therefore estimates that only a quarter to a maximum of a third of the usual amount of rain came from the sky that year.

“Severe heat and drought in summer and autumn also afflicted Silesia, where there was practically no rain for 6 months. Many streams dried up and the water in the River Oder turned green. There were frequent forest fires and livestock suffered from hunger and thirst (Büsching, 1819). Similarly, severe heat, forest fires, poor harvest, shortages, and famine were mentioned for Bohemia, Silesia, and Lusatia (Gomolcke, 1737). In Greater Poland, summer and autumn were also very dry; it did not rain until the beginning of winter. Rivers were exceedingly low, brooks, ponds, and wells dried up and the land was desiccated to dust.” (Rojecki et al., 1965)

“… water levels were low everywhere; it was even possible to ride or walk across the River Rhine.” (Brazdil et al.)

The LIA was also not confined to Europe, contrary to what Climate Change believers would like to suggest:

Africa: 

“In Ethiopia and Mauritania, permanent snow was reported on mountain peaks at levels where it does not occur today. Timbuktu, an important city on the trans-Saharan caravan route, was flooded at least 13 times by the Niger River; there are no records of similar flooding before or since.”

and 

“In Southern Africa, sediment cores retrieved from Lake Malawi show colder conditions between 1570 and 1820, suggesting the Lake Malawi records ‘further support, and extend, the global expanse of the Little Ice Age’. A novel 3,000-year temperature reconstruction method, based on the rate of stalagmite growth in a cold cave in South Africa, further suggests a cold period from 1500 to 1800 ‘characterising the South African Little Ice Age’.”

Antarctica:

“Law Dome ice cores show lower levels of CO2 mixing ratios during 1550–1800, results that Etheridge and Steele conjecture are ‘probably as a result of colder global climate’.”

and

“Sediment cores in Bransfield Basin, Antarctic Peninsula, have neoglacial indicators by diatom and sea-ice taxa variations during the period of the LIA. The MES stable isotope record suggests that the Ross Sea region experienced 1.6 ± 1.4 °C cooler average temperatures during the LIA in comparison to the last 150 years.”

Australia/New Zealand:

“Lake records in Victoria suggest that conditions, at least in the south of the state, were wet and/or unusually cool. In the north, evidence suggests fairly dry conditions, while coral cores from the Great Barrier Reef show similar rainfall as today, but with less variability.”

and

“On the west coast of the Southern Alps of New Zealand, the Franz Josef glacier advanced rapidly during the Little Ice Age, reaching its maximum extent in the early eighteenth century, in one of the few places where a glacier thrust into a rain forest.”

“Based on dating of a yellow-green lichen of the Rhizocarpon subgenus, the Mueller Glacier, on the eastern flank of the Southern Alps within Aoraki/Mount Cook National Park, is considered to have been at its maximum extent between 1725 and 1730.”

Pacific Islands:

“Sea-level data for the Pacific Islands suggest that sea level in the region fell, possibly in two stages, between 1270 and 1475. This was associated with a 1.5 °C fall in temperature (determined from oxygen-isotope analysis) and an observed increase in El Niño frequency.”

“Tropical Pacific coral records indicate the most frequent, intense El Niño-Southern Oscillation activity in the mid-seventeenth century.”

South America:

“Tree-ring data from Patagonia show cold episodes between 1270 and 1380 and from 1520 to 1670, periods contemporary with LIA events in the Northern Hemisphere.”

and

“Eight sediment cores taken from Puyehue Lake have been interpreted as showing a humid period from 1470 to 1700, which the authors describe as a regional marker of LIA onset.”

“A 2009 paper details cooler and wetter conditions in southeastern South America between 1550 and 1800, citing evidence obtained via several proxies and models. δ18O records from three Andean ice cores show a cool period from 1600–1800.”

Source: Wikipedia

In the past, the contention has been made by Climate change advocates that Global, and specifically Arctic warming observed in recent decades is in some way responsible for increasingly wavy, meandering jet streams. 

In reply, I would point out that these same advocates were previously happy to point to the exact opposite (i.e. zonal jet streams) as evidence of Global Warming, and that it seems more or less a post hoc rationalisation to then claim that the opposite, meandering jet streams and the polar vortex outbreaks that arise under this regime, is now also a consequence of Global Warming.

Now comes a paper from Blackport and Screen (2020) in Science Advances, which supports my conviction that this is a reverse-engineered explanation for events, crafted to fit the conventional narrative.

The introduction from the paper:

“Whether Arctic amplification has contributed to a wavier circulation and more frequent extreme weather in midlatitudes remains an open question.

For two to three decades starting from the mid-1980s, accelerated Arctic warming and a reduced meridional near-surface temperature gradient coincided with a wavier circulation. However, waviness remains largely unchanged in model simulations featuring strong Arctic amplification. 

Here, we show that the previously reported trend toward a wavier circulation during autumn and winter has reversed in recent years, despite continued Arctic amplification, resulting in negligible multi-decadal trends. Models capture the observed correspondence between a reduced temperature gradient and increased waviness on inter-annual to decadal time scales. However, model experiments in which a reduced temperature gradient is imposed do not feature increased wave amplitude.

Our results strongly suggest that the observed and simulated co-variability between waviness and temperature gradients on inter-annual to decadal time scales does not represent a forced response to Arctic amplification.”

The allegedly anthropogenic, CO2-induced polar vortices were first described as early as 1853, over 100 years prior to the “Anthropocene”. The polar vortex’s associated sudden stratospheric warming (SSW) was first discovered in 1952, using radiosonde observations at altitudes higher than 20 km. It is important to note that radiosondes themselves were developed for the US Navy in the late 1930s (~1936-37) and only came to be deployed for such observations some time after WW2 ended in 1945, so the phenomenon was observed shortly after the instruments capable of measuring it became available. Likewise, the attenuation of Antarctic ozone (known improperly as the “Ozone Hole”) was first observed by Dobson in 1956, the very first time observations of the phenomenon were made. We clearly must live in a neo-Aristotelian era, where if we did not observe something, even when we lacked the means to look for or measure it, it apparently did not exist.

Climate Science under the microscope:

The numerous examples elucidated above are extreme weather events that were clearly caused by natural “climate change” and could not possibly have been caused by rising CO2 levels, because they predated any possible anthropogenic input: pre-1945 industrial CO2 production was indistinguishable from background fluxes. This raises the question: why does Climate Change doctrine deny the presence of these extreme events and instead fixate on every post-1945 climate extreme being anthropogenic, to the exclusion of natural causes? To answer that question, one must address the psychology of Climate Change doctrine and the Climate Science that spawned it, in which analysis there are clearly a few factors to consider:

1) Anyone who enters Climate Science as a discipline is, a priori, predisposed to a passion for the preservation of the environment, not just the study of it, and as a result would be unlikely to remain objective toward any facet of that study which they are “reliably” informed is a potential threat to that environment.

2) Climate Science is a relatively new discipline, with older climatologists historically deriving from other natural sciences and geography (as Dr Tim Ball and Hubert Lamb did). Climate Science 101 is that CO2 is the primary driver of temperature. It is a foundational belief that their whole discipline relies upon. That makes it very difficult, unless you are a heretic, to buck the system to the extent of questioning that “fact” regardless of how tenuous that belief might be. It is just accepted passively, because to not do so would see you looking for another career in double quick time. 

3) The financial and fame-seeking distortion of thinking that derives from the highly politicised nature of Climate Science as it has evolved. There is now considerable prestige, fame and large financial grants flowing into what was once a backwater, fringe area of science, one which would otherwise be nearly anonymous but for its current cause célèbre status.

4) Noble cause corruption is a well known trap for the unwary, and the desire to work as an activist in a discipline that potentially saves the world from an existential crisis appeals to the nascent narcissism that lies at the heart of even the most modest and self-effacing researcher. 

5) Confirmation bias: in the pressure-cooker environment of Climate Science as a political tool, the pressure (both internal and external) is high to ignore inconvenient facts that call the basic premise into question, whilst overweighting anything which confirms the premise, without looking at alternative explanations for the data at hand.

6) Excessive reliance on computer modelling and non-predictive “climate projections” derived from these models, at the expense of objective assessment of the totality of climate observations, as the predicate for “action” on Climate Change. These computer simulations are only as accurate as the assumptions input into them; they represent a theoretical framework, not an observation, nor a substitute for one.

The climate models have only been “successful” in Climate Science thus far if you:

1) cherry-pick your starting point (1980 rather than 1945, or 1880 rather than 1840, or 1000 AD or 40 BC or 2000 BC or 6000 BC);

2) have 73, or even 105 of them to choose from, all with different input parameters;

3) have fudge factors built into them (aerosols, etc.) that can be modified at will to train them and make them align with observations;

4) ignore that they fail completely on local and regional scales;

5) ignore that back-testing of every model shows that the inputs must be altered to match the past;

6) ignore that they cannot model significant aspects of climate, particularly cloud dynamics.

Clouds are admitted by mainstream climate science to be poorly understood and poorly modelled in those existing climate models they rely upon, precisely because the mechanisms that modify cloudiness are poorly understood. 

Cloud cover is not just a feedback to surface temperature, as is assumed under Anthropogenic Climate Change theory.

There are, for one thing, many different types of atmospheric particulates that can act as cloud condensation nuclei. Those particles may be composed of dust or clay, soot or black carbon from grassland or forest fires (or indeed from industrial sources), sea salt from ocean wave spray, sulphate from volcanic activity or phytoplankton emissions, or nuclei formed by cosmic radiation.

Minor fluctuations in phytoplankton blooms alone, for instance, can influence cloud cover and albedo by up to an impressive 10 Wm⁻².

The ability of these different types of particles to form cloud droplets varies according to their size and also their exact composition.

As the renowned physicist Freeman Dyson said before his death in 2020:

“I have studied the climate models and I know what they can do. The models solve the equations of fluid dynamics, and they do a very good job of describing the fluid motions of the atmosphere and the oceans. They do a very poor job of describing the clouds, the dust, the chemistry and the biology of fields and farms and forests. They do not begin to describe the real world that we live in.

The real world is muddy and messy and full of things that we do not yet understand. It is much easier for a scientist to sit in an air-conditioned building and run computer models, than to put on winter clothes and measure what is really happening outside in the swamps and the clouds. That is why the climate model experts end up believing their own models.”

Taking the above points into consideration, it is important to recognise that a consensus of like-minded people, all agreeing with each other, no matter how “expert”, is no guarantee of truth. The mania, prevalent across most scientific disciplines, for scientists to obsess over the number of citations a paper has, or whether it is published in so-called “high impact journals”, as some kind of guarantor of quality and/or truth, is as disturbing as it is delusional.

Truth and knowledge are not a democracy. A complete lie believed by 90%+ of experts is still a lie, whilst a truth known and empirically proven by one lone investigator is still the truth, regardless of how many fail to be persuaded of it due to their entrained biases to the contrary.

In a similar vein, it should be noted that the much vaunted peer review system, when used to provide constructive criticism of scientific works, can be a welcome oversight that promotes quality assurance. However, when applied as a selective gatekeeper, it can equally be used injudiciously and unfairly to filter out unpopular perspectives and research that goes against the grain of received wisdom. In that capacity it acts as a drag on innovation and a promoter of groupthink within an increasing number of scientific fields, and could well be done without.

Peer review, as it has evolved over time, has become primarily interested in gatekeeping to entrench the prevailing wisdom. It is the governor that limits the speed of progress of scientific advancement, and acts to negate novel ideas and lateral thinking so that even the most mediocre of minds can keep up with the pace of progress.

Clearly, Isaac Newton didn’t need it, and Albert Einstein didn’t need it. They put their theories into the public forum to rise or fall according to their merits, without the need for their peers to protect them from scrutiny or criticism. Modern peer review, in a limited way, has its place, but when unconstrained and over-valued it achieves the opposite of what it is intended to produce: innovative, quality scientific works that propel society forward toward a greater understanding of the natural world and the scientific principles that underpin that knowledge.

Conclusion:

Human beings and their activities, as even most sceptical people would acknowledge, certainly influence weather and temperature on a local scale through altered land use, albedo changes, vegetation clearance and Urban Heat Island effects. Wind farms, for one such example, reduce local wind speeds and cause increased local fog and temperature rises in their immediate vicinity.

Globally, however, our climate impact on the planetary environment is dwarfed by processes like ocean cycles, which are also clearly beyond our control. Solar influence on climate, on the other hand, is both direct (solar spectral variations, EUV on ozone, the ionosphere) and indirect (on cloudiness, tropopause height, jet stream position and strength). Quantifying the exact impact of each of these is beyond the scope of our current knowledge, and the remit of this article.

Arctic sea ice, often wrongly claimed to be the canary in the coal mine for anthropogenic climate change, has fluctuated (and will continue to do so) throughout the Holocene interglacial. There were many ice-free summers in the early parts of the Holocene, and across several of the subsequent warming periods prior to the Modern Warming Period we are currently experiencing. Ocean currents and undersea volcanism within the Arctic Circle may actually have more relevance to this than any alleged rise in atmospheric temperature caused by man-made CO2 emissions. As to land ice, the Little Ice Age was marked by huge glacial advances, with glaciers growing to their greatest extent in 10,000 years during this era; they would be expected to decline in a subsequent warming period regardless of any cause or attribution.

Sea temperatures, on the other hand, were highest in the Holocene Climate Optimum, and in modern times have increased as a function of the high solar short-wave infrared radiation (SWIR) input from the well above average intensity solar cycles in the years from 1950 to 2006. Importantly, it is the sun that heats the global oceans (maximally in the tropics and subtropics) via SWIR, and the oceans that then heat the atmosphere, not vice versa. That is why the air directly above the ocean surface is cooler over any 24-hour period than the ocean surface waters, and why the ocean has a cool “skin”. Sea level rise can similarly be seen as a result of emerging from the Little Ice Age, through the subsequent widespread glacier retreat and through thermal expansion of the global oceans due to the increase in incident solar SWIR heating in recent decades.

Our inability to quantify the myriad of natural factors that shape our changing global climate, along with our propensity to view a complex stochastic system through the very limited lens of carbon dioxide as the main driver of global climate changes, has led us to many false conclusions which simply ignore the climate of the past, and the clear precedents that give the lie to the baseless assertion that it is man, and mankind alone, that shapes the current and future climate of the planet.

A final word…..

Climate Change alarmism has managed, through a concerted campaign of propaganda, distortions and obfuscations, plus the suppression of any and all dissent to its cause, to achieve the following over the last 30 years or more:

– convincing people that science is not actually a method, but what government approved “scientists” say is factual, and that those who question “The Science” are inherently evil, or in the pay of planet destroying enterprises.

– convincing a generation of children that even highly regulated capital enterprises are colonialist, racist and exploitative, when it has been capitalism that has elevated humanity out of its former hand-to-mouth existence and done more to reduce global poverty than any other single force in the history of the human species.

– convincing the younger generations in Western liberal democracies that Marxist socialism is the only system of governance capable of preventing an environmental catastrophe, even though the track record of such societies is diametrically opposed to protection of the environment. Protection and stewardship of the environment are actually intrinsic to, and only derive from, societies such as ours in Western democracies, based on free-market capitalist principles.

– persuading children that their future is predestined to be one of environmental catastrophe, mass species extinction and an uninhabitable planet, with the ensuing effect on their extreme levels of anxiety and unhappiness, along with the negative impacts on their self-esteem and any hopes they had for a bright and prosperous future.

– proposing “solutions” to an alleged environmental issue that not only undermine the ability of people to protect themselves from the adverse effects of the putative problem, but are also almost certain to promote widespread energy poverty and rationing, and the destruction of primary industry as it becomes uneconomic to produce the goods our society relies upon.

So much to be proud of.

Gain of Function: The True Proximal Origins of SARS CoV2?

SARS CoV2 Novel Coronavirus (Source: CDC)

The SARS CoV2 pandemic that has plagued the world from its very first documented cases in November/December 2019 in Wuhan, and spread far and wide across the entire globe over nearly three years since then, has been responsible at the time of writing for over 600 million confirmed cases, and upwards of 6.5 million deaths.

Given the severity of the death and destruction wrought by this one-in-a-hundred-year pandemic, the proximal origin of the beta-coronavirus responsible assumes a vital level of importance in understanding just how this catastrophe came into being. 

Without an accurate understanding of the virus’ origins, we are flying blind in any hope of preventing just this kind of biological disaster from recurring in the future.

The People’s Republic of China owes the rest of the world a transparent investigation and forthright explanation of the exact sequence of events, including but not limited to identifying “patient zero” from whence this pandemic arose, in order to justify its place as a supposedly trusted and valued member of the global community.

The analysis of the response to this pandemic, which each nation has approached differently with varying degrees of misfortune and success, is by contrast a matter for each nation to soul-search and pick apart for its own edification, to improve its responses to any future pandemic. Such investigations have little or no bearing on any nation’s interests other than its own, valuable potential lessons notwithstanding.

Those answers the PRC would need to provide that are germane to the origins of the SARS CoV2 virus include:

a) if the virus is naturally derived: it then needs to be determined from where, how and why that zoonotic jump occurred; how the sequence of events that subsequently transmitted it to the human population, first locally and then abroad, could best have been prevented; and what measures could have been enacted to prevent its spread beyond the initial locale (Wuhan/Hubei province), and

b) if the virus was laboratory derived: just what research was actually undertaken, what methodology was used and under what precautions, with what purpose and intent, and how it then leaked from either secure or unsecured laboratory facilities into the surrounding community, then loco-regionally, and from there internationally.

These are vital, and I would suggest non-negotiable, questions that need to be answered with honesty and integrity, and China is obligated as a global citizen to comply with reasonable requests for transparency and access to laboratory and medical records to provide these answers. Sadly, quite the polar opposite has been the experience up to this point in time.

The Cover Story:

From the earliest days of the pandemic, the Chinese Communist Party, the World Health Organisation, and a closely aligned and self-reinforcing clique of prominent virologists with skin in the “Gain-of-Function”* game have been in desperate and concerted damage control, not only suppressing any discussion of potential laboratory origins for SARS CoV2 (via either accidental or deliberate leak from the Wuhan Institute of Virology), but simultaneously bending over backwards to promulgate far-fetched natural origin theories involving wet markets, in spite of a conspicuous lack of evidence for any intermediary host or hosts.

*NB: Gain-of-Function virological research is medical research that genetically alters an organism in a way that may enhance the biological functions of gene products. This may include altered pathogenesis, transmissibility, or host range, i.e. altering the types of hosts that a microorganism can infect (with a focus on humans in particular), making it more infectious, or increasing its virulence and severity. This research is allegedly intended to reveal targets to better predict emerging infectious diseases, and to develop vaccines and therapeutics. Unfortunately, it is also ripe for exploitation and potential misuse, whether in deploying such biological agents in acts of warfare, or in designing them to undermine the peace and socio-economic prosperity of rival nations.

Wuhan ‘Wet Market’ (Source: Weibo)

The revulsion many Westerners feel about some of the practices in Chinese “wet markets” makes many readily accept that these perceived barbaric and inhumane practices must be the cause of the pandemic, based purely on the perceived immorality of the behaviours involved. It is entirely possible to abhor these practices, or to wish to curtail or discourage them, and still not treat it as a fait accompli that the wet markets must be the source of the SARS CoV2 virus.

Notwithstanding these emotive arguments, the evidence that the virus actually arose from those wet markets is fairly unconvincing. Unlike the preceding SARS 1 and MERS outbreaks, the beta-coronavirus infections from recent history that most closely align with SARS CoV2, no intermediate host (civets in SARS 1, camels in MERS) has been found despite an intensive search by Chinese authorities that included the testing of more than 80,000 animals, including those in the wet markets alleged by some to have been the source of the outbreak.

Similarly, no evidence whatsoever has been found of the SARS CoV2 virus having made multiple independent jumps from an intermediate host to people, as both the SARS1 and MERS viruses did. The intermediary host species of SARS1 was identified within 4 months of the epidemic’s outbreak, and the host of MERS was found within nine months. In spite of intensive searching by Chinese authorities, no intermediate host for SARS CoV2 has been found nearly 3 years after the onset of the pandemic.

Nor has the SARS CoV2 virus been isolated from the alleged source bat populations found within very distantly situated caves, even though those caves have previously yielded similar bat coronaviruses. The two closest known relatives of the SARS2 virus were collected from bats living in caves in Yunnan, a province in southern China. For the SARS2 virus to have spilled over into humans naturally, it should first have infected people living around the Yunnan caves, yet this is not what actually happened. The pandemic’s first cases broke out 1,500 kilometers away, in Wuhan.

Serological testing of blood samples from the civilian population elsewhere throughout China (not merely surrounding the caves in Yunnan) shows no evidence at all of prior spread, nor do hospital surveillance records from fever clinics demonstrate the epidemic gathering strength as the virus has been alleged to have evolved prior to those first cases in Wuhan in December 2019.

There is clearly no explanation as to how and why a naturally derived epidemic should break out initially in Wuhan and yet nowhere else, including those regions known to harbour bat colonies. The family of bat beta-coronaviruses to which SARS2 belongs infects the horseshoe bat Rhinolophus affinis, a species that ranges across southern China. These bats have a range of 50 km, so they are unlikely to have made it all the way to Wuhan without the assistance of humans.

Rhinolophus affinis (Source: Nature.com)

Since the first cases of the Covid-19 pandemic probably occurred in or after September 2019, when temperatures in Hubei province are already cold enough to send bats into hibernation, natural transmission to humans at this time of year is also highly unlikely.

The natural emergence theory therefore has little to recommend it, and becomes increasingly implausible given the sheer number of improbabilities that must be accepted for it to hold.


The Furin Cleavage Site, A Unique Feature of SARS CoV2:

No good explanation has been thus far posited as to how the SARS CoV2 virus acquired its furin cleavage site, a feature which no other SARS-related beta-coronavirus possesses, nor why the furin cleavage site is composed of human-preferred codons, rather than genetic sequences commonly associated with bats.

From Nicholas Wade:

In the case of SARS1, researchers have documented the successive changes in its spike protein as the virus evolved step by step into a dangerous pathogen. After it had gotten from bats into civets, there were six further changes in its spike protein before it became a mild pathogen in people. After a further 14 changes, the virus was much better adapted to humans, and with a further 4 the epidemic took off.

But when you look for the fingerprints of a similar transition in SARS2, a strange surprise awaits. The virus has changed hardly at all, at least until recently. From its very first appearance, it was well adapted to human cells. Researchers led by Alina Chan of the Broad Institute compared SARS2 with late stage SARS1, which by then was well adapted to human cells, and found that the two viruses were similarly well adapted. “By the time SARS-CoV-2 was first detected in late 2019, it was already pre-adapted to human transmission to an extent similar to late epidemic SARS-CoV,” they wrote.

Even those who think lab origin unlikely agree that SARS2 genomes are remarkably uniform. World leading authority on bat coronaviruses, Dr. Ralph Baric writes that “early strains identified in Wuhan, China, showed limited genetic diversity, which suggests that the virus may have been introduced from a single source.” The uniform structure of SARS2 genomes gives no hint of any passage through an intermediate animal host, and no such host has been identified in nature.

All the other viruses have their S2 unit cleaved at a different site and by a different mechanism. Mutation seems a less likely way for SARS2’s furin cleavage site to be generated, even though it can’t completely be ruled out. The site’s four amino acid units are all together, and all at just the right place in the S1/S2 junction. Mutation is a random process triggered by copying errors (when new viral genomes are being generated) or by chemical decay of genomic units. So it typically affects single amino acids at different spots in a protein chain. A string of amino acids like that of the furin cleavage site is much more likely to be acquired all together through a quite different process known as recombination.

Compared with the RaTG13 coronavirus (claimed in the Wuhan Institute of Virology database to have ~96% homology with the COVID 19 virus), SARS CoV2 has a 12-nucleotide PRRA insert that occurs right at the S1/S2 junction of the Spike protein. The insert lies within the sequence T-CCT-CGG-CGG-GC. The CCT codes for Proline, the two CGG’s for two Arginines, and the GC is the beginning of a GCA codon that codes for Alanine. CGG is an uncommon (~5%) codon for Arginine in these viruses (Arginine can be encoded by the codons CGT, CGC, CGA, CGG, AGA and AGG), and the presence of two such rare codons together at the S1/S2 junction on the Spike protein is a combination highly unlikely to arise via natural emergence, especially when this sequence does not occur in any other beta-coronavirus, and by coincidence favours penetration of human cells rather than bat or even pangolin cells.
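The codon readings above can be checked mechanically. The sketch below is a minimal illustration using the standard genetic code (the codon assignments are standard biology; the `translate` helper and the abbreviated codon table are purely illustrative, not drawn from any cited source), translating the in-frame PRRA codons discussed in the text:

```python
# Abbreviated standard genetic code table, covering only the amino acid
# families discussed in the text (Proline, Arginine, Alanine).
CODON_TABLE = {
    "CCT": "Pro", "CCC": "Pro", "CCA": "Pro", "CCG": "Pro",  # Proline
    "CGT": "Arg", "CGC": "Arg", "CGA": "Arg", "CGG": "Arg",  # Arginine
    "AGA": "Arg", "AGG": "Arg",                              # Arginine (cont.)
    "GCT": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",  # Alanine
}

def translate(codons):
    """Map each three-letter codon to its amino acid abbreviation."""
    return [CODON_TABLE[c] for c in codons]

# Reading the insert in frame: CCT-CGG-CGG-GCA
print(translate(["CCT", "CGG", "CGG", "GCA"]))  # -> ['Pro', 'Arg', 'Arg', 'Ala']
```

As the table shows, Arginine has six possible codons, which is what makes the back-to-back appearance of the rare CGG the coincidence the text highlights.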

Two gain-of-function modifications of bat CoV RaTG13 promoting transmission to humans (Source: ACS medicinal Chemistry Letters)

As a consequence of this insert at the S1/S2 junction, SARS CoV2 (as genomically sequenced at the onset of its emergence in December 2019) has only a very minor affinity for bat cells, making dubious the claim that it was ever, at least naturally, derived from bats in its initial form.

As Liu et al. showed, "the insertion of PRRA into the RaTG13 S protein selectively abrogated the usage of horseshoe bat and pangolin ACE2, but enhanced the usage of mouse ACE2 by the relevant pseudo-virus to enter cells." (Note: the Angiotensin-converting enzyme 2 (ACE2) is not only an enzyme but also a functional receptor on cell surfaces through which SARS-CoV-2 enters the host cells; it is highly expressed in the heart, kidneys, and lungs and shed into the plasma.)

As Liu et al. further state:

The four-residue insert (PRRA) at the boundary between the S1 and S2 subunits of SARS-CoV-2 has been widely recognized since day 1 for its role in SARS-CoV-2 S protein processing and activation. As this PRRA insert is unique to SARS-CoV-2 among group b betacoronaviruses, it is thought to affect the tissue and species tropism of SARS-CoV-2. We compared the usages of 10 ACE2 orthologs and found that the presence of PRRA not only affects the cellular tropism of SARS-CoV-2 but also modulates the usage of ACE2 orthologs by the closely related bat RaTG13 S protein. The binding of pseudo-virions carrying RaTG13 S with a PRRA insert to mouse ACE2 was nearly 2-fold higher than that of pseudo-virions carrying RaTG13 S without the PRRA insert.

In other words, the PRRA insert at the S1/S2 junction site modulates cellular tropism (i.e. the range of host cells the virus can infect) of the SARS-CoV-2, and governs its species specific ACE2 receptor usage. 

The Cover Up:

From the very earliest days of the COVID 19 pandemic, the potential for a leak or deliberate release of the SARS CoV2 coronavirus from the Wuhan Institute of Virology was recognised by some, but this hypothesis has been denied, disparaged, obfuscated, allegedly "debunked", or otherwise discredited from all directions. Universally, this relentless negativity has come from those with a vested interest in quietly burying any suggestion of negligence or malfeasance on the part of various governments, institutions or individuals.

With almost unseemly (and somewhat telling) haste, the virology community went into damage control, with the most prominent and influential example being the Kristian G. Andersen et al (2020) paper “The proximal origin of SARS-CoV-2” in Nature Medicine, which made the extraordinary claim that it was “irrefutable” that the SARS CoV2 virus could not have originated in a laboratory.

This scientific paper, from researchers at the Scripps Research Institute in La Jolla, California, was touted as disproving the laboratory origins of SARS CoV-2, but it contains some interesting quotes that belie the certainty of that claim and deserve greater scrutiny, as a blog comment I made on 26 April 2020, reproduced below, demonstrates:

Quote 1: “It is improbable that SARS-CoV-2 emerged through laboratory manipulation of a related SARS-CoV-like coronavirus. As noted above, the RBD of SARS-CoV-2 is optimized for binding to human ACE2 with an efficient solution different from those previously predicted. Furthermore, if genetic manipulation had been performed, one of the several reverse-genetic systems available for betacoronaviruses would probably have been used

However, the genetic data irrefutably show that SARS-CoV-2 is not derived from any previously used virus backbone. 

Instead, we propose two scenarios that can plausibly explain the origin of SARS-CoV-2: (i) natural selection in an animal host before zoonotic transfer; and (ii) natural selection in humans following zoonotic transfer. 

We also discuss whether selection during passage could have given rise to SARS-CoV-2.”  

Rebuttal 1: To crystallise the researchers' argument, the authors suggest that it COULD NOT have been made in a laboratory because:

a) the binding of the spike protein to the ACE receptor is not how they would have “previously predicted”. 

This presupposes that Chinese virologists share their data and research openly and that the entire virology community restricts themselves to “predicted” methodology. Knowing the Chinese Communist Party and their propensity for the utmost secrecy in their approach to R&D, particularly in the prospective development of bioweapons, that claim stretches credibility to breaking point. 

b) that the virus backbone itself doesn't conform to a sequence known to Western virologists. This is alleged to be "irrefutable". On the contrary: there is no shared, "open-source" database of the genomic structure of viruses that is described comprehensively in the literature.

Also, no one knows precisely what viruses are stored in China’s Wuhan BSL-4 laboratory, particularly since 2015 when the CCP launched a ramping up of such research for enhancing their biological “defence“ capabilities. 

The authors also say some curious things that cast doubt on their assumptions: 

Quote 2: “Given the level of genetic variation in the spike, it is likely that SARS-CoV-2-like viruses with partial or full polybasic cleavage sites will be discovered in other species.” 

Rebuttal 2: This is classic wishful thinking dressed up as science. “Likely” is a word that substitutes conveniently for “we hope we might find evidence of this some day”. It is in fact a tacit admission that they have no natural precedent to point to for their claim that natural derivation of SARS CoV-2 was “irrefutable”. 

Quote 3: “The acquisition of polybasic cleavage sites by HA has also been observed after repeated passage in cell culture or through animals”

Rebuttal 3: This is also an admission of sorts that such research has been done previously, but tellingly without detailing the known precedents of gain of function via serial passage through generations of ferrets, or via extended passage in tissue culture on human cell lines in the laboratory, precedents of which the authors could not help but be aware.

Quote 4: “For a precursor virus to acquire both the polybasic cleavage site and mutations in the spike protein suitable for binding to human ACE2, an animal host would probably have to have a high population density (to allow natural selection to proceed efficiently) and an ACE2-encoding gene that is similar to the human ortholog.” 

Rebuttal 4: This could just as easily, if not more easily, have been achieved using tissue culture in human cell lines, or serial generations of ferrets in the laboratory, which allow not-so-natural selection to proceed even more efficiently than nature could ever manage.

Quote 5: “Subsequent generation of a polybasic cleavage site would have then required repeated passage in cell culture or animals with ACE2 receptors similar to those of humans, but such work has also not previously been described.” 

Rebuttal 5: Except it has, in multiple papers using serial passage through ferrets; notably Imai et al. (2012) in Nature, and Herfst et al. (2012) in Science.

The other claim in the Andersen et al. paper that is speculation dressed up as conclusive fact is that any manipulation of the coronavirus genome would be visible to expert virologists, but that is not remotely true, let alone irrefutable. Dr. Ralph Baric of the University of North Carolina had patented a new method in 2002 for assembling the genome of viruses, allowing the creation of new pathogens without leaving any visible traces of laboratory intervention.

Baric called this new technology the “no-see-ums method” and he claimed it had “wide and largely underestimated applications in molecular biology.” Through this essentially invisible revision and reassembly of viral genomes, it would be impossible to determine if any virologist had interfered with the genome after using Dr. Ralph Baric’s technology.*

* Dr Ralph Baric’s application for his invention was entitled “Methods for producing recombinant coronavirus” (No. 7279327). Importantly, in respect of the denials in the Andersen paper, the patent describes a method for targeted modification of viruses of the families Coronaviridae (i.e. Coronaviruses) and Arteriviridae.

Claims made in the Andersen et al. paper also seem unaware of the known precedent from Russia dating back to the 1970s, where an H1N1 influenza virus leaked from a Soviet lab and was ultimately considered to be the product of laboratory manipulation, "since it was very similar to strains of H1N1 that hadn't been in circulation for decades, and seemed to be the product of sequential passage in an animal reservoir, which was determined by its vast genetic distance from any other present strain of flu."

The Andersen paper also ignores the fact that SARS 1 has leaked out of other Chinese labs on at least four separate occasions that we know of in recent years, and that the Chinese virology lab at Wuhan was known to have been heavily involved in dual-use gain-of-function research, swapping around viral genomes in the lab to try to create the most virulent strain possible, research so troubling that the US National Institutes of Health banned it from 2014 to 2017 before strangely lifting the ban again.

Ferrets share with humans a high level of affinity with the spike protein of SARS-CoV-2 on their ACE2 receptors, making them an ideal laboratory reservoir to pass down several generations to make the virus appear naturally derived. Additionally, studies examining COVID-19’s infectivity in ferrets found that it spreads readily among them, and also appears airborne in that animal model, lending support to the idea that ferrets could have been used for serial passage.

Those claiming this could not have originated artificially in laboratories actively researching these viruses, are dissembling and/or lying by omission at the very least.

There is more than a little fear in the global virology community that their gain of function research might be blamed for a global pandemic that was likely, even at that early stage, to cost millions of lives. That is strong motivation to “believe” in (or perhaps more accurately hope for) a natural derivation for the SARS CoV2 virus.

We only have the assurances of the Wuhan Institute of Virology's lead bat coronavirus researcher, Zhengli Shi, that the SARS CoV2 genome doesn't match anything in her lab. Her "discovery", RaTG13, is alleged to have 96% homology with the SARS CoV2 virus, but despite allegedly being discovered in 2013, it was not announced to the world until January 2020, after the SARS CoV2 outbreak had occurred, and it has not been independently verified to exist outside of an RNA sequence on a computer.

Zhengli Shi and Researcher in the Biosafety Level 4 Laboratory at Wuhan Institute of Virology, Hubei province, Feb. 23, 2017 (Source: Johannes Eisele/VCG)

From Steven Mosher, who asserts that Zhengli Shi has indeed fabricated the genomic structure of the nearest known relative of SARS CoV2 (as RaTG13 is alleged to be) as cover for the natural evolution theory, and that it likely exists merely as a data sequence on her computer:

“In nature, the ratio of synonymous to non-synonymous (mutations)* is approximately 5:1.

Here's where Dr. Shi got into trouble. When typing in the genomic sequence of her "discovery" she made way too many non-synonymous changes at the beginning. Then, one-third of the way through the sequence, she apparently realized her error. After that, she made way too few non-synonymous changes. So while the entire genome has the expected 5:1 ratio, there are stretches where the ratio is closer to 2:1, and other long stretches where it is as high as 44:1." (i.e. the changes were not randomly distributed, a tell-tale sign of fabrication)

"Nature's mutations are random. Dr. Shi's "mutations" are not. Dr. Lawrence Sellin puts the odds that her "mutations" occurred naturally in just one area, the critical spike protein, at almost ten million to one."

*Note: Synonymous mutations do not change the amino acid sequence, while non-synonymous mutations do. Synonymous mutations are therefore functionally silent and evolutionarily neutral, whereas non-synonymous mutations alter the protein and are subject to selection, making them evolutionarily significant.
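The check Mosher describes can be sketched as follows. This is purely an illustration of the distinction (and of counting the two kinds of change between aligned sequences), using a tiny excerpt of the standard genetic code and made-up toy sequences; the function names are my own:

```python
# Illustrative sketch of the synonymous vs non-synonymous count described above.
# GENETIC_CODE is a small excerpt of the standard genetic code, not a full table.
GENETIC_CODE = {
    "CGT": "R", "CGC": "R", "CGA": "R", "CGG": "R",  # Arginine (a synonymous set)
    "CTT": "L", "CTC": "L",                          # Leucine
    "CCT": "P", "GCT": "A",                          # Proline, Alanine
}

def classify(codon_a, codon_b):
    """'syn' if the amino acid is unchanged, 'nonsyn' if changed, None if identical codons."""
    if codon_a == codon_b:
        return None
    return "syn" if GENETIC_CODE[codon_a] == GENETIC_CODE[codon_b] else "nonsyn"

def syn_nonsyn_counts(seq_a, seq_b):
    """Count synonymous and non-synonymous codon differences between two aligned sequences."""
    counts = {"syn": 0, "nonsyn": 0}
    for i in range(0, len(seq_a) - 2, 3):          # step through codon by codon
        kind = classify(seq_a[i:i + 3], seq_b[i:i + 3])
        if kind:
            counts[kind] += 1
    return counts

# CGT -> CGC is silent (Arg -> Arg); CTT -> CCT changes Leu -> Pro.
print(syn_nonsyn_counts("CGTCTTGCT", "CGCCCTGCT"))  # {'syn': 1, 'nonsyn': 1}
```

Running such counts in sliding windows along a genome, rather than over the whole sequence at once, is what would expose the kind of local 2:1 and 44:1 stretches Mosher alleges, even when the genome-wide average looks like the expected 5:1.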

The virology records at the Wuhan Institute of Virology, including the work of the "bat lady" Zhengli Shi, have since been conveniently destroyed at the behest of the CCP in the aftermath of the outbreak. That doesn't sound, on the face of it, like the Chinese authorities believe they have nothing to hide.

In fact, after China’s National Health Commission issued the “No. 3 document“, the Wuhan Institute of Virology of the Chinese Academy of Sciences was required to stop pathogen detection, stop experimental activities and destroy existing samples.

I would also like to suggest that the U.S authorities may not be keen for this to be found to have originated from a laboratory either, as they had an active hand in funding it and collaborated with Chinese virologists on such Gain-of-Function work. Such a finding would almost certainly put paid to any similar biological research they might be undertaking, leaving them at a disadvantage to their Chinese counterparts, who will no doubt return to this dangerous research when the controversy over SARS CoV2 blows over.

Of course, the credulous amongst us will just allow this highly dangerous biological Gain-of-Function research to continue unchecked, taking comfort from one hastily written, conveniently timed rebuttal whose authors had zero access to the Wuhan laboratory records to confirm their prognostications."

Curiouser and Curiouser, Alice (A Trip Down the Rabbit Hole):

With a global pandemic likely to cause several million deaths, and the loss of $trillions in economic damage, it is indeed curious that prominent virologists neglect to mention such inconvenient truths as the Herfst et al. (2012) paper, in which a highly pathogenic avian influenza virus was passed through a ferret population (noting that ferrets have significant receptor homology to humans) to render it airborne-transmissible between mammals. This particular research was so troubling that the U.S National Institutes of Health banned dual-use, gain-of-function research like this from 2014 to 2017, because it was ascertained that if that particular avian influenza virus had escaped from the laboratory the estimated mortality rate could have been upwards of 60%!

Yet, the authors in Andersen et al. fail to mention this, or reference this highly relevant study in their reference list. 

There is not much motivation in the virology sector to cast such serious aspersions on the research they are all currently undertaking, no matter how dangerous it might be, to the detriment of their future research funding. That is to say nothing of the legal and economic ramifications for former colleagues and the virology community in general, should the virus ever be determined to have been man made in a laboratory rather than naturally derived, even if it was released into the community by unfortunate accident.

Also, from the Menachery, et al (2015) paper (Dr Ralph Baric was one of the co-authors) that I referenced above, entitled: 

“A SARS-like cluster of circulating bat coronaviruses shows potential for human emergence” 

– comes the following interesting snippet worth pondering as a precedent involving altering SARS 1 to enhance its infectivity to humans: 

<<Using the SARS-CoV reverse genetics system, we generated and characterized a chimeric virus expressing the spike of bat coronavirus SHC014 in a mouse-adapted SARS-CoV backbone. 

The results indicate that group 2b viruses encoding the SHC014 spike in a wild-type backbone can efficiently use multiple orthologs of the SARS receptor human angiotensin converting enzyme II (ACE2), replicate efficiently in primary human airway cells and achieve in vitro titers equivalent to epidemic strains of SARS-CoV. 

Additionally, in vivo experiments demonstrate replication of the chimeric virus in mouse lung with notable pathogenesis. 

Evaluation of available SARS-based immune-therapeutic and prophylactic modalities revealed poor efficacy; both monoclonal antibody and vaccine approaches failed to neutralize and protect from infection with CoVs using the novel spike protein. >>

This paper shows a number of interesting things: 

a) that virologists are actively generating chimeric bat coronaviruses in the laboratory with variations of the SARS-CoV1 backbone in combination with various different spike protein variations; 

b) that the pathogenesis of these chimeric viruses can be enhanced through this methodology; and

c) that in this instance at least, the novel spike protein significantly reduced the efficacy of antibody protection and/or vaccination to prevent subsequent infection.

The latter finding suggests that the much hoped-for vaccine against SARS CoV 2 may not be fully efficacious at preventing subsequent infection, even if one is "successfully" developed.

Curiosity piqued by the White Rabbit still further down the rabbit hole:

“A Reconstructed Historical Aetiology of the SARS-CoV-2 Spike” Birger Sørensen, Angus Dalgleish & Andres Susrud (2020): 

Abstract: To discover exactly how to attack SARS-CoV-2 safely and efficiently, our vaccine candidate Biovacc-19 was designed by first carefully analysing the biochemistry of the Spike. We ascertained that it is highly unusual in several respects, unlike any other CoV in its clade. 

The SARS-CoV-2 general mode of action is as a co-receptor dependent phagocyte. But data shows that simultaneously it is capable of binding to ACE2 receptors in its receptor binding domain.

In short, SARS-CoV-2 is possessed of dual action capability. In this paper we argue that the likelihood of this being the result of natural processes is very small. The spike has six inserts which are unique fingerprints, with five salient features indicative of purposive manipulation.

We then add to the bio-chemistry a diachronic dimension by analysing a sequence of four linked published research projects which, we suggest, show by deduction how, where, when and by whom the SARS-CoV-2 Spike acquired its special characteristics. 

This reconstructed historical aetiology meets the criteria of means, timing, agent and place to produce sufficient confidence to reverse the burden of proof. Henceforth, those who would maintain that the Covid-19 pandemic arose from zoonotic transfer need to explain precisely why this more parsimonious account is wrong before asserting that their evidence is persuasive, most especially when, as we also show, there are puzzling errors in their use of evidence. 

The conclusion: The Evidence Suggests That This Is No Naturally Evolved Virus.

Another interesting, not altogether unrelated tidbit worthy of consideration:

“COVID19 has an affinity for the ACE2 receptor 10-20 times higher than SARS 1, and it also creates viral loads thousands of times higher than SARS1.” 

These two characteristics point towards COVID-19 using antibody-dependent enhancement, or ADE, to enter human cells. 

This is when the virus is able to hijack white blood cells or other antiviral proteins to more easily enter into the rest of our body’s cells, allowing it to seep deep into its hosts’ nervous systems, creating permanent neurological damage in the hosts it doesn’t kill outright. 

Supporting evidence that ADE is occurring with COVID-19 infection can be found in a paper from April 2020, which indicated that the novel Coronavirus is targeting two different types of white blood cells incredibly efficiently, especially compared to SARS 1, which hardly binds to white blood cells at all. Since those types of cells have almost no ACE2 receptors, it appears that COVID-19 is using another method to efficiently enter those cells, with ADE at the top of the list of possibilities, although it's not demonstrated here.

(Reprinted from “Coronavirus Replication Cycle”, by BioRender.com (2020). Retrieved from https://app.biorender.com/biorender-templates)

The aforementioned Zhengli Shi, of UNC and Wuhan Institute of Virology fame, co-authored a 2019 paper which used inert viral shells to figure out exactly how SARS 1, with its affinity to the ACE2 receptor just like COVID-19, was able to harness ADE to hijack white blood cells for enhanced cell entry. She and other associates were actively developing chimeric SARS-like Coronaviruses at the Wuhan Virology Lab, using the highly problematic dual use, gain of function research techniques, and were actively recruiting researchers specifically for this in late 2019.

A gain-of-function extension of this research would be exactly the kind of experiment that could have given birth to SARS CoV2, especially considering that the 2019 paper managed to fine-tune the exact concentration of antibodies that would best facilitate ADE.

Also of some note for the curious, via BioRxiv, the pre-print server for Biology from 30 April, 2020:

“Spike Mutation Pipeline Reveals the Emergence of a More Transmissible Form of SARS-CoV-2”

<<To date we have identified fourteen mutations in Spike that are accumulating… The mutation Spike D614G is of urgent concern: it began spreading in Europe in early February, and when introduced to new regions it rapidly becomes the dominant form.>>

The implications are that these mutated strains may differ in infectivity, and perhaps in morbidity and mortality (though this remains to be seen), with consequences for the possibility of antigenic shift, the validity of antibody protection, the potential for re-infection, and the potential efficacy of any vaccine that might be developed, even if one is eventually found effective against some strains of this virus.

A deeper dive:

Professor Giuseppe Tritto is an internationally known expert in biotechnology and nanotechnology who, in addition to a stellar academic career, is the president of the World Academy of Biomedical Sciences and Technologies (WABT), an institution founded under the aegis of UNESCO in 1997. One of the goals of WABT is to analyze the effect of biotechnologies, such as genetic engineering, on humanity.

Professor Tritto has just launched a new book, available only in Italian at this stage, called “Cina COVID 19: La Chimera che ha cambiato il Mondo” (“China COVID 19: The chimera that changed the world”), in which he outlines the evidence that the SARS CoV2 virus was indeed synthesised in a laboratory, and not naturally derived.

From LifeSite News:

‘What sets Prof. Tritto’s book apart is the fact that it demonstrates—conclusively, in my view—the pathway by which a PLA-owned coronavirus was genetically modified to become the China Virus now ravaging the world. His account leaves no doubt that it is a “chimera”, an organism created in a lab.

In vaccine development, reverse genetics is used to create viral strains that have reduced pathogenicity but to which the immune system responds by creating antibodies against the virus. But reverse genetics can also be used to create viral strains that have increased pathogenicity. That is what Dr. Shi (Zhengli), encouraged by PLA bioweapons experts, began increasingly to focus her research on, according to Prof. Tritto.

Dr. Shi first solicited help from the French government, which built the P4 lab, and from the country’s Pasteur institute, which showed her how to manipulate HIV genomes. The gene insertion method used is called “Reverse Genetics System 2.” Using this method, she inserted an HIV segment into a coronavirus discovered in horseshoe bats to make it more infectious and lethal. 

The U.S. was involved as well, particularly Prof Ralph S. Baric, of the University of North Carolina, who was on the receiving end of major grants from the National Institute of Allergy and Infectious Disease. This is, of course, Dr. Anthony Fauci’s shop. Dr Fauci was a big proponent of “Gain-of-Function” research, and when this was prohibited at Ralph Baric’s lab because it was considered to be too dangerous, the research was then shifted on to China.

Prof. Tritto believes that, while Dr. Shi’s research began as an effort to develop a vaccine against SARS, it gradually morphed into an effort to use “reverse genetics” to build lethal biological weapons. This was the reason that the Wuhan lab became China’s leading center for virology research in recent years, attracting major funding and support from the central government.

When asked why China has refused to provide the complete genome of the China Virus to the WHO or to other countries, Dr. Tritto explained that “providing the matrix virus would have meant admitting that SARS-CoV-2 [China Virus] was created in the laboratory. In fact, the incomplete genome made available by China lacks some inserts of AIDS amino acids, which itself is a smoking gun.”

The key question, for those of us who are living through the pandemic, concerns the development of a vaccine. On this score, Prof. Tritto is not optimistic: "Given the many mutations of SARS-CoV-2, it is extremely unlikely that a single vaccine that blocks the virus will be found. At the moment 11 different strains have been identified: the A2a genetic line which developed in Europe and the B1 genetic line which took root in North America are more contagious than the 0 strain originating in Wuhan. I therefore believe that, at the most, a multivalent vaccine can be found effective on 4-5 strains and thus able to cover 70-75% of the world's population."

Dissident Chinese virologist Li-Meng Yan has just posted a link to a paper she co-authored on the origins of SARS CoV2 entitled:

“Unusual Features of the SARS-CoV-2 Genome Suggesting Sophisticated Laboratory Modification Rather Than Natural Evolution and Delineation of Its Probable Synthetic Route.”

She demonstrates that the S2 component of the Spike protein has 95% sequence identity with two existing bat coronaviruses, but the S1 component (which contains the receptor binding motif (RBM) determining which hosts the Spike protein can infect) has only 69% amino acid sequence identity with existing coronaviruses.

This effectively rules out natural derivation by either convergent evolution or a recent natural recombination event, given the large discrepancy between the S1 and S2 portions of the Spike protein sequence.
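For readers unfamiliar with the metric, percent sequence identity over an aligned region is the simple calculation behind figures like the 95% (S2) and 69% (S1) quoted above. A minimal sketch, using made-up toy sequences rather than real viral data:

```python
# Minimal sketch of pairwise percent identity over a pre-aligned region.
# The toy sequences below are invented purely for illustration.
def percent_identity(seq_a, seq_b):
    """Percentage of positions identical between two equal-length aligned sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

print(percent_identity("MFVFLVLLPL", "MFVFLVLLPL"))  # 100.0
print(percent_identity("MFVFLVLLPL", "MFVFAVLLSL"))  # 80.0  (8 of 10 positions match)
```

The argument in the Yan paper turns on computing this separately for the S1 and S2 regions of the same protein: a recombination or common-descent story has to explain why one half of the Spike matches known bat coronaviruses so much more closely than the other.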

Evidence presented in this part reveals that certain aspects of the SARS-CoV-2 genome are extremely difficult to reconcile to being a result of natural evolution. The alternative theory we suggest is that the virus may have been created by using ZC45/ZXC21 bat coronavirus(es) as the backbone and/or template. 

The Spike protein, especially the RBM within it, should have been artificially manipulated, upon which the virus has acquired the ability to bind hACE2 and infect humans. This is supported by the finding of a unique restriction enzyme digestion site at either end of the RBM. An unusual furin-cleavage site may have been introduced and inserted at the S1/S2 junction of the Spike protein, which contributes to the increased virulence and pathogenicity of the virus. 

These transformations have then staged the SARS CoV-2 virus to eventually become a highly-transmissible, onset-hidden, lethal, sequelae-unclear, and massively disruptive pathogen. 

Evidently, the possibility that SARS-CoV-2 could have been created through gain-of-function manipulations at the WIV is significant and should be investigated thoroughly and independently.

Finally, from February 2021 comes this assessment:

Via Wiley online open access:

“The genetic structure of SARS‐CoV‐2 does not rule out a laboratory origin” Segreto and Deigin (2020)

This paper sets out in detail why the arguments in the Andersen et al. (2020) paper are not only not cut and dried, but are also out of date, viz:

The main argument brought by the authors is that the high‐affinity binding of the SARS‐CoV‐2 spike protein to hACE2 could not have been predicted by models based on the RBD of SARS‐CoV. Based on the structural analysis conducted by Wan et al., SARS‐CoV‐2 has the potential to recognize hACE2 more efficiently than the SARS‐CoV, which emerged in 2002. 

Moreover, generation of CoV chimeric strains has recently demonstrated that bat CoV spikes can bind to the hACE2 receptor with more plasticity than previously predicted. All amino acids in the RBD have been extensively analyzed and new models to predict ACE2 affinity are available.

It also faults the Andersen paper for failing to account for well-known research in the field, viz:

…creation of chimeric viruses has been carried out over the years with the purpose of studying the potential pathogenicity of bat CoVs for humans. In this context, SARS‐CoV‐2 could have been synthesized by combining a backbone similar to RaTG13 with the RBD of a CoV similar to the one recently isolated from pangolins, because the latter is characterized by a higher affinity with the hACE2 receptor. Such research could have aimed to identify pangolins as possible intermediate hosts for bat‐CoV potentially pathogenic for humans. Subsequent serial cell or animal passage, as described by Sirotkin & Sirotkin, could have provided the perfect adaptation of the RBD to the hACE2.

Strangely, they go on to make very similar arguments to mine in relation to that paper: that it was characterised by sweeping statements not backed up with evidence, glaring omissions of foundational studies in the field, and rhetorical techniques designed to guide readers to a conclusion of which the authors had no reason to be so certain.

The conclusion to the paper is instructive for those with an open mind:

On the basis of our analysis, an artificial origin of SARS‐CoV‐2 is not a baseless conspiracy theory that is to be condemned and researchers have the responsibility to consider all possible causes for SARS‐CoV‐2 emergence. The insertion of human‐adapted pangolin CoV RBD obtained by cell/animal serial passage and furin cleavage site could arise from site‐directed mutagenesis experiments, in a context of evolutionary studies or development of pan‐CoV vaccines or drugs. 

A recent article in Nature affirms that a laboratory origin for SARS‐CoV‐2 cannot be ruled out, as researchers could have been infected accidentally, and that gain‐of‐function experiments resulting in SARS‐CoV‐2 could have been performed at WIV. Genetic manipulation of SARS‐CoV‐2 may have been carried out in any laboratory in the world with access to the backbone sequence and the necessary equipment and it would not leave any trace. Modern technologies based on synthetic genetics platforms allow the reconstruction of viruses based on their genomic sequence, without the need of a natural isolate

A thorough investigation on strain collections and research records in all laboratories involved in CoV research before SARS‐CoV‐2 outbreak is urgently needed. Special attention should be paid to strains of CoVs that were generated in virology laboratories but have not yet been published, as those possibly described in the deleted WIV database. Because finding a possible natural host could take years, as with the first SARS, or never succeed, equal priority should be given to investigating natural and laboratory origins of SARS‐CoV‐2. 

Xiao Qiang, a research scientist at Berkeley, recently stated: “To understand exactly how this virus has originated is critical knowledge for preventing this from happening in the future.”

Conclusion:

The assertion by the virology community that the genetic makeup of SARS CoV2 was somehow "incompatible with lab origin" was a leap of logic that was completely bogus. It remains a desperate attempt to deflect from the dangerous Gain-of-Function research in which both the U.S and China had been engaged.

I and others have shown that the published assessment that first claimed irrefutability, Andersen et al. (2020), ignored the 2008 research paper of Dr Ralph Baric, the most pre-eminent bat coronavirus expert in the world, which showed how chimeric viruses could be created without any detectable trace of human manipulation.

Andersen et al. also avoid citing that paper, and the other well-known papers mentioned above that contradict their position, and then boldly state that because there were allegedly no detectable features of human manipulation, the virus COULD NOT be man-made.

Many of these researchers were either engaged in this kind of research themselves, or had professional relationships with those who do, both in the US and China. They are hardly likely to point the finger of blame for a global pandemic with millions of deaths at colleagues, associates or themselves, given all the adverse consequences that ensued, and the recriminations that would likely follow for them personally and for their careers.****

**** For a comprehensive (and anger-inducing) analysis of the politics behind the bat coronavirus gain-of-function research, the behind-closed-doors controversy in scientific circles, and how a closely aligned clique of like-minded GoF virologists commandeered the “science”, see the following article, published in 2024, some time after this post was written. It reinforces much of what has been posted above, with added context going back to 2011 and a paper by Ron Fouchier, a subsequent paper by Yoshihiro Kawaoka in 2014, and the responses of an organisation called “Scientists for Science” to this controversial research, which set the stage for outsourcing such work to Wuhan, and then covering up the egregious consequences of their reckless actions to save their own skins from an outraged public: https://brownstone.org/articles/the-boys-will-be-boys-of-science/

Of course, a three-hour tour by the WHO investigators, a year after all the evidence had been destroyed by the Chinese authorities, could not hope to confidently and authoritatively exclude the laboratory origin hypothesis, yet such a cursory examination was somehow deemed sufficient to “resolve” the question.

Yet implicit in the recent accusations by Chinese Communist Party officials that SARS-CoV-2 was likely introduced into China by imported frozen meat, in the previous assertion that it was brought in by US soldiers attending the World Military Games in Wuhan, and in whatever desperate spin the CCP may dream up next, is an admission that they have been unable to find any direct evidence to support the natural zoonotic origin and spread narrative.

It is the absence of such evidence that allows them to make these claims: no plausible intermediary evolutionary progenitors in either animal or human reservoirs; no serological evidence of similar viral illnesses in the hundreds of fever-clinic blood samples from the months leading up to the outbreak; and no retrospective serological evidence in the putative zoonotic source (all pangolins tested were negative, in contrast to civets for SARS-1 and camels for MERS). Any of that evidence, had it been present, would immediately have contradicted those baseless assertions.

The fact that such evidence does not exist, nearly three years after the outbreak began and in spite of extensive searching and a recent WHO investigation, would lead anyone to conclude that a natural zoonotic cause is the least likely option, and that the Chinese authorities, more than aware of that lack of evidence, are spinning desperately as a consequence.

In gambling parlance, it is what the professionals refer to as a “tell”.

Gain of Function Research and the Mad Hatter’s Tea Party:

Eventually, as curiosity leads one further down the rabbit hole of the source of this pandemic, where a laboratory leak of a chimeric man-made virus becomes distinctly more likely, one is inevitably led to Dr Anthony Fauci and Dr Peter Daszak, the Mad Hatter and the March Hare in this little adventure in the topsy-turvy world of Wonderland. Their potential role in promoting and funding what is euphemistically called “gain-of-function” research (engineering viruses to be more pathogenic and deadly), an inherently dangerous branch of virology ripe for opening up a Mad Hatter’s Tea Party of designer bioweapons and biological warfare, is more than worthy of detailed scrutiny.

Anthony S. Fauci, director of the National Institute of Allergy and Infectious Diseases, listens as President Donald Trump participates in a vaccine development event at the White House on May 15, 2020. (Source: Jabin Botsford/The Washington Post)

Via The Intercept:

According to documents obtained from the U.S. National Institutes of Health (NIH):

“Understanding the Risk of Bat Coronavirus Emergence” (Principal Investigator: Peter Daszak) – awarded a $666,442 grant on 27/5/2014.

The Wuhan Institute of Virology and the nearby Wuhan University Center for Animal Experiments, along with their collaborator, the U.S.-based nonprofit EcoHealth Alliance, have engaged in what the U.S. government defines as “gain-of-function research of concern,” intentionally making viruses more pathogenic or transmissible in order to study them, despite stipulations from a U.S. funding agency that the money not be used for that purpose.

Grant money for the controversial experiment came from the National Institutes of Health’s National Institute of Allergy and Infectious Diseases, which is headed by Anthony Fauci. The award to EcoHealth Alliance, a research organization which studies the spread of viruses from animals to humans, included subawards to Wuhan Institute of Virology and East China Normal University. The principal investigator on the grant is EcoHealth Alliance President Peter Daszak, who has been a key voice in the search for Covid-19’s origins.

In a paper written for the American Society for Microbiology in October 2012, Dr Anthony Fauci argued in support of gain-of-function research, which involves making viruses more infectious and/or deadly. Various experts in the field have raised the possibility that the COVID-19 pandemic could have originated from a lab leak at the Wuhan Institute of Virology in Wuhan, China, where gain-of-function experiments on bat coronaviruses have been conducted.

Despite the risks involved, Fauci called gain-of-function experiments “important work” in his 2012 writing:

“In an unlikely but conceivable turn of events, what if that scientist becomes infected with the virus, which leads to an outbreak and ultimately triggers a pandemic? Many ask reasonable questions: given the possibility of such a scenario – however remote – should the initial experiments have been performed and/or published in the first place, and what were the processes involved in this decision?

Scientists working in this field might say – as indeed I have said – that the benefits of such experiments and the resulting knowledge outweigh the risks. It is more likely that a pandemic would occur in nature, and the need to stay ahead of such a threat is a primary reason for performing an experiment that might appear to be risky.

Within the research community, many have expressed concern that important research progress could come to a halt just because of the fear that someone, somewhere, might attempt to replicate these experiments sloppily. This is a valid concern.”

Anthony Fauci – despite claiming otherwise on multiple occasions – was in fact aware of the monetary relationship between NIAID, the NIH, EcoHealth Alliance and the Wuhan lab – by January 27, 2020. Fauci also knew that EcoHealth and NIAID worked together to craft a grant policy which would ‘sidestep the Gain-of-Function moratorium at the time.’

In 2016, NIAID determined, conveniently, that this virology work was not subject to the gain-of-function (GoF) research funding pause and the subsequent HHS P3CO Framework.

Scientists working under a 2014 NIH grant to the EcoHealth Alliance to study bat coronaviruses combined the genetic material from a “parent” coronavirus known as WIV1 with other viruses. They twice submitted summaries of their work that showed that, when in the lungs of genetically engineered “humanized” mice, three altered bat coronaviruses at times reproduced far more quickly than the original virus on which they were based. The altered viruses were also somewhat more pathogenic, with one causing the mice to lose significant weight. The researchers reported, “These results demonstrate varying pathogenicity of SARSr-CoVs with different spike proteins in humanized mice.”

If this is not “Gain-of-Function”, then one must ask just what does constitute such experiments?

Even so, the terms of the grant clearly stipulated that the funding could not be used for gain-of-function experiments. The grant conditions also required the researchers to immediately report potentially dangerous results and stop their experiments pending further NIH review. According to both the EcoHealth Alliance and NIH, the results were reported to the agency, but NIH determined that rules designed to restrict gain-of-function research did not apply. Again, this represents somewhat convenient semantic game-playing, hiding behind definitions of what does and does not constitute gain of function, when any disinterested party would be alarmed at the implications of this research, especially the use of “humanised mice” to enhance the pathogenicity (in humans) of SARS-1-related viruses.

“In my point of view, the debate about the definition of ‘Gain-of-Function’ has been too much focused on technical aspects,” said Jacques van Helden, a professor of bioinformatics at Aix-Marseille Université. “The real question is whether or not research has the potential to create or facilitate the selection of viruses that might infect humans. The experiments described in the proposal clearly do have that potential,” he said.

Damn straight!

In my view, Anthony Fauci and Peter Daszak, along with prominent virologists like Dr Ralph Baric working in the field of gain-of-function research, have been guilty at the very least of a wanton, and possibly negligent, disregard for the ethical considerations and risks inherent in such research. They held a naive and credulous belief in the motives and sloppy safety practices of the Chinese Communist Party-controlled Wuhan Institute of Virology, and were complicit in skirting the Obama administration’s regulatory pause on such research in the U.S. in order to single-mindedly pursue it overseas instead. The terrible consequences of the COVID-19 pandemic therefore lie, at least in significant part, at their feet. Such mind-boggling hubris should have consequences.

Unfortunately, the end result of the virology community’s failure to ethically limit itself, or to be held accountable for the hubris and cavalier risk-taking that led to the COVID-19 pandemic, is that the Pandora’s box of gain of function has been opened, and the window of opportunity to close it again, if that were ever possible, has now shut.

As a consequence, we have now ushered in an era, in the not-too-distant future, of near-inevitable biological warfare, in which designer chimeric viruses can be manufactured and released to produce pandemics in response to geopolitical conflicts between nations with the capacity to create such bioweapons. In the worst iteration of this nightmare scenario, these bioweapons may eventually even be targeted at specific ethnic subgroups of the global population.

The future survival of the human species is therefore likely, at some point, to be at the mercy of the morality (or lack thereof) of the leadership of nations with the technological expertise to “weaponise” viral pathogens, whether to promote their geopolitical aims or when conflict between nations spills over into asymmetric warfare options such as this.

A Binary Poison: the Root Cause of the 2008 Global Financial Crisis

The amendment to the Community Reinvestment Act in 1995 and the abolition of the Glass-Steagall Act shortly thereafter in 1999, both under President Bill Clinton’s administration, acted as a delayed-action binary poison that lay at the very foundation of the Global Financial Crisis (GFC) that followed a decade later.

What do I mean by the term “binary poison”? It is a chemical combination that creates a special reaction or byproduct only when two specific reagents are mixed together, reagents that may each have little if any adverse effect when taken separately.

Neither of these legislative actions by the Clinton Administration, on their own, could conceivably have led to the catastrophic financial events of the GFC that followed, but it was their combination that proved to be the poisoning of the financial well that would ultimately lead to the “subprime mortgage crisis” in 2008.

Add in the catalyst of the Commodities Futures Modernization Act of 2000, another policy product of the Clinton Administration, speeding up the reaction of this binary poison exponentially through the deregulation of financial derivatives, and you finally have the ultimate witches’ brew that set us on this path to eventual economic ruin.

The Community Reinvestment Act (CRA): (Reagent 1)

Initiated under President Jimmy Carter in 1977, the CRA was a hands-off regulatory framework allegedly designed to reduce arbitrary denials of mortgages. Then, after intense lobbying by “community organisers”, and litigation helmed by Barack Obama on behalf of ACORN in a test case against Citibank, the CRA was further amended via the Regulatory Improvement Act under President Bill Clinton.

This 1995 amendment to the Community Reinvestment Act imposed on banks a legal requirement to lend home-purchase money to millions of poor, mainly black Americans, loans which were then guaranteed by the two biggest US mortgage associations, Fannie Mae and Freddie Mac.

Of not insignificant note, no one had campaigned more actively for this change to the law than a young, but already influential Chicago politician and “community organiser” by the name of Barack Obama.*

In 2005, when moves were made to halt Fannie Mae’s reckless guarantees, no one more actively opposed them than the man who, by then, had become Senator Obama. 

As official records also showed, no senator received more campaigning donations from Fannie Mae (although Hillary Clinton ran him close). In 2009, just after Obama had become President, it is somewhat ironic to note that he had, by then, become vociferous in blaming the banks primarily for the crisis.

The Community Reinvestment Act, “well meaning” though it may have been, created a flood of high-risk and ultimately unsustainable mortgages, which then led on to the subprime mortgage crisis.

Gramm-Leach-Bliley Act: (Reagent 2)

President Clinton then signed the Gramm-Leach-Bliley Act, aka the Financial Services Modernization Act of 1999, which repealed the 1933 Glass-Steagall Act: the four provisions of the Banking Act of 1933 that limited commercial bank securities activities and affiliations between commercial banks and securities firms. These Depression-era Glass-Steagall protections had effectively prevented savings banks and their mortgages from being coupled with high-risk, speculative financial products like derivatives, while also curtailing the formation of banking leviathans with fingers in too many high-risk pies, behemoths so big that allowing them to fail would likely collapse the entire US, and possibly even the global, economic system.

Permitting the formation of these “super-banks” repeated the same kinds of structural conflicts of interest that were endemic in the 1920s: it allowed lending to speculators, who packaged and securitised credits and then sold them off, wholesale or retail, extracting fees at every step along the way. Having created ever larger banks that were too big to be allowed to fail, it also provided perverse incentives for excessive risk taking by those banking corporations, giving them unwavering confidence that taxpayers would bail them out when things (inevitably) went wrong.

Commodities Futures Modernization Act of 2000: (the Catalyst)

With general (although not universal) agreement in Washington that deregulation of financial services would benefit the economy, the Clinton Administration and Congress decided that the derivatives market should be largely unregulated. This position was codified in the Commodities Futures Modernization Act of 2000 (along with the Gramm-Leach-Bliley Act of 1999). The Clinton Administration and US Treasury Secretary Larry Summers lobbied for the Act, and joined former Treasury Secretary Robert Rubin in both privately and publicly attacking advocates of regulatory oversight of these financial instruments.

In the 1990s, derivatives had been widely heralded as new instruments that could improve the transference of risk. In 1999, then Federal Reserve Chairman Alan Greenspan stated: “By far the most significant event in finance during the past decade has been the extraordinary development and expansion of financial derivatives.” He went on to argue that they “enhance the ability to differentiate risk and allocate it to those investors most able and willing to take it. This unbundling improves the ability of the market to engender a set of product and asset prices far more calibrated to the value preferences of consumers than was possible before derivative markets were developed. The product and asset price signals enable entrepreneurs to finely allocate real capital facilities to produce those goods and services most valued by consumers, a process that has undoubtedly improved national productivity growth and standards of living.” 

When the financial crisis began in 2007, derivatives played a central role and were the catalyst for the spiral out of control that followed. American International Group (AIG), among others, had sold a form of derivative known as credit default swaps, or CDSs (essentially a form of default insurance), on billions of dollars of complex securities ultimately backed by shaky mortgages (termed collateralised debt obligations, or CDOs). AIG believed it was taking on little risk by selling these derivatives, because a widespread housing price collapse was not seen as even a remote possibility. Little capital had therefore been set aside to cover potential losses on these swaps, largely because the buyers of the swaps relied on AIG’s stellar credit rating. When CDOs began dropping in value, AIG’s large derivative exposure brought it to the brink of bankruptcy and required a large government bailout to prevent its counterparties from taking losses and triggering additional financial instability.
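The mechanics of the protection seller’s miscalculation can be sketched numerically. The toy model below (all figures are hypothetical, chosen for illustration; they are not drawn from AIG’s actual book) shows why a seller who assumes defaults are rare and uncorrelated reserves almost no capital, and how the same position balloons when a correlated housing collapse drives default rates up and recovery rates down:

```python
# Toy model of a credit-default-swap (CDS) seller's exposure.
# Illustrative only: the notional, default probabilities and recovery
# rates below are hypothetical, not AIG's actual positions.

def expected_cds_payout(notional, default_prob, recovery_rate):
    """Expected payout on a CDS: the protection seller covers the loss
    (notional minus whatever is recovered) if the reference security defaults."""
    return default_prob * notional * (1.0 - recovery_rate)

notional = 1_000_000_000   # $1bn of protection sold on mortgage-backed CDOs

# Under the seller's benign assumption (2% default rate, 40% recovery),
# expected losses look tiny, so little capital is set aside against them.
benign = expected_cds_payout(notional, default_prob=0.02, recovery_rate=0.40)

# In a correlated housing collapse, defaults cluster: suppose the true
# default rate jumps to 50% while recoveries on the collateral fall to 20%.
crisis = expected_cds_payout(notional, default_prob=0.50, recovery_rate=0.20)

print(f"Assumed exposure: ${benign:,.0f}")
print(f"Crisis exposure:  ${crisis:,.0f}")
```

Under these made-up numbers the expected obligation grows by more than thirtyfold, which is the essence of why so little reserved capital proved so catastrophically inadequate.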

From Wikipedia:

“The Financial Crisis Inquiry Commission reported its findings in January 2011. Briefly summarising its main conclusions, the Commission stated:

While the vulnerabilities that created the potential for crisis were years in the making, it was the collapse of the housing bubble—fuelled by low interest rates, easy and available credit, scant regulation, and toxic mortgages—that was the spark that ignited a string of events, which led to a full-blown crisis in the fall of 2008. Trillions of dollars in risky mortgages had become embedded throughout the financial system, as mortgage-related securities were packaged, repackaged, and sold to investors around the world. 

When the bubble burst, hundreds of billions of dollars in losses in mortgages and mortgage-related securities shook markets as well as financial institutions that had significant exposures to those mortgages and had borrowed heavily against them. This happened not just in the United States but around the world. The losses were magnified by derivatives such as synthetic securities.”

From Larry Elder:

“President Bill Clinton, in 1995, added teeth to the CRA. Economists Stephen Moore and Lawrence Kudlow explained: “Under Clinton’s Housing and Urban Development (HUD) secretary, Andrew Cuomo, Community Reinvestment Act regulators gave banks higher ratings for home loans made in ‘credit-deprived’ areas. 

Banks were effectively rewarded for throwing out sound underwriting standards and writing loans to those who were at high risk of defaulting. If banks didn’t comply with these rules, regulators reined in their ability to expand lending and deposits.

These new HUD rules lowered down payments from the traditional 20 percent to 3 percent by 1995 and zero down-payments by 2000. What’s more, in the Clinton push to issue home loans to lower income borrowers, Fannie Mae and Freddie Mac made a common practice to virtually end credit documentation, low credit scores were disregarded, and income and job history was also thrown aside. The phrase ‘subprime’ became commonplace. What an understatement.”

Peter Wallison, Arthur F. Burns Chair in Financial Policy Studies and co-director of the American Enterprise Institute (AEI), in his book “Hidden in Plain Sight: What Really Caused the World’s Worst Financial Crisis and Why It Could Happen Again”, uses data point after data point after data point to incontestably prove that:

(a) 31 million “non-traditional” mortgages (NTMs) had come into the American economy by 2008 (NTM being a more important classification than the media’s beloved “subprime mortgages”, as it demonstrates where the real toxicity proved to be: in the combination of subprime with “Alt-A”, ‘alternatives to agencies’, loans that didn’t meet the underwriting standards of Fannie/Freddie). These 31 million NTMs were at the heart of the crisis; and

(b) They were the direct result of the government housing policy that called them into being; and

(c) The mere existence of this massive bloc of troubled mortgages was bad enough, but the extensive effort to cover up their existence, to hide or suppress the real number of NTMs in the system, led to a very faulty climate for risk-taking; and

(d) Efforts to continue legislating social mandates via government housing policy will continue to lead to economic instability and even repeated crises.

There would have been no financial crisis without aggressive government mandates to increase lending where it did not belong, and without the misguided belief that social policy could be administered through economic experimentation with various lending mandates, an experiment which created an intensely malignant culture in the world of mortgage lending.

Peter Wallison further proves, beyond any shadow of a doubt, that Fannie Mae and Freddie Mac did not merely get “caught up in the pursuit of ill-gotten gains,” as is often alleged, mostly in an attempt to blame free-market greed for Fannie/Freddie’s decidedly non-free-market troubles. Rather, Wallison shows, Fannie and Freddie got heavily into the NTM world because they had no choice if they were to meet their government mandates.

The Obama administration’s Financial Crisis Inquiry Commission, assembled by Congress in July 2009 to thoroughly investigate the crisis and its causes, then proved to be an utter waste of taxpayer money. It was not just a waste because the commission’s findings got almost everything wrong (Peter Wallison being the sole dissenter from its pre-baked conclusions), but also because the whole pretext for the inquiry was to guide changes in policy that could stem the tide in the future. The Obama administration’s remedy to the financial crisis, the Dodd-Frank legislation, was rammed through Congress into law before the commission had even completed its work or submitted its findings. It was therefore a show pony of an investigation: a pre-ordained, window-dressing inquiry meant to give the appearance of addressing the crisis, and to “inform” remedial legislation that had already been formulated and enacted independent of its conclusions.

A permanent financial truth can clearly be derived from these events: namely that you are not doing people a favour by making them loans they cannot afford—just the opposite.

It can therefore be seen, from all that has been elucidated above, that it was the combination of these two momentous decisions, the amendment to the Community Reinvestment Act and the subsequent abolition of the Glass-Steagall Act, and not either of them in isolation, that ultimately caused the Global Financial Crisis which followed, as night follows day, a decade down the road these decisions had set in motion.

U.S. politicians and so-called “economic experts” have serially covered up this failure ever since (and continue to do so to this day) with various excuses and mental contortions, generally to absolve themselves, and their favoured politicians and policymakers, from the blame they so richly deserve.

Not that President Bill Clinton, or the young activist lawyer/community organiser Barack Obama (the latter of whom lobbied for one half of the binary poison), can take total “credit” (i.e. blame) for causing the Global Financial Crisis, because there are plenty of other actors who had important roles in this melodrama.

Here’s a partial list of those alleged to be at fault:

(a) The Federal Reserve, which slashed interest rates after the dot-com bubble burst, making credit cheap.

(b) Home buyers, who took advantage of easy credit to bid up the prices of homes excessively.

(c) Congress, which continues to support a mortgage tax deduction that gives consumers a tax incentive to buy more expensive houses.

(d) Real estate agents, most of whom work for the sellers rather than the buyers and who earned higher commissions from selling more expensive homes.

(e) The Clinton administration, which pushed for less stringent credit and downpayment requirements for working- and middle-class families.

(f) Mortgage brokers, who offered less-credit-worthy home buyers subprime, adjustable rate loans with low initial payments, but exploding interest rates.

(g) Former Federal Reserve chairman Alan Greenspan, who in 2004, near the peak of the housing bubble, encouraged Americans to take out adjustable rate mortgages.

(h) Wall Street firms, who paid too little attention to the quality of the risky loans that they bundled into Mortgage Backed Securities (MBS), and issued bonds using those securities as collateral.

(i) The Bush administration, which failed to provide needed government oversight of the increasingly dicey mortgage-backed securities market.

(j) An obscure accounting rule called mark-to-market, which can have the paradoxical result of making assets worth less on paper than they are in reality during times of panic.

(k) Collective delusion, or a belief on the part of all parties that home prices would keep rising forever, no matter how high or how fast they had already gone up, and that mortgage interest rates would remain low indefinitely.
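Item (j) above, mark-to-market, deserves a brief illustration. The sketch below uses entirely hypothetical numbers to show how, in a panic, an asset whose borrowers are still mostly paying can nonetheless be booked at a fire-sale price far below its hold-to-maturity value:

```python
# Toy illustration of mark-to-market accounting during a panic.
# All cash flows, rates and prices below are hypothetical, for illustration only.

def held_to_maturity_value(cash_flows, discount_rate):
    """Fundamental value: the discounted cash flows the asset will actually pay."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# A mortgage bond paying $100 a year for 5 years, discounted at 5%.
cash_flows = [100] * 5
fundamental = held_to_maturity_value(cash_flows, 0.05)

# In a panic, forced sellers flood the market and the last trade prints at
# 40% of total face value; mark-to-market rules book the asset at that price.
panic_market_price = 0.40 * sum(cash_flows)

print(f"Fundamental (hold-to-maturity) value: ${fundamental:.2f}")
print(f"Mark-to-market value in the panic:    ${panic_market_price:.2f}")
```

The gap between the two figures is the paper loss the rule forces onto balance sheets, which in 2008 triggered further forced selling and deepened the spiral.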

The financial crisis would never have happened, however, without the accelerated attempt of Washington D.C., and the Clinton administration in particular, to legislate social policy through housing policy, combined with the abolition of regulatory acts meant to protect the mortgage lending and savings sectors from the coupling of high-risk speculation with standard banking tasks.

Footnote:

*President Barack Obama was a pioneering contributor to the national subprime real estate bubble.

Roughly half of the 186 African-American clients in his landmark 1995 mortgage discrimination lawsuit against Citibank have since gone bankrupt or received foreclosure notices. 

As few as 19 of those 186 clients still own homes with clean credit ratings, following a decade in which Obama and other progressives pushed banks to provide mortgages to poor African Americans. 

The startling failure rate among Obama’s private sector clients was discovered during The Daily Caller’s review of previously unpublished court information from the lawsuit that a young Obama helmed as the lead plaintiff’s attorney.