The Art of Climate Modeling, Lecture 01: Overview / History (Introduction to Atmospheric Dynamics)
Video Summary
Core Theme
This content introduces the field of global atmospheric modeling, tracing its historical development from early meteorological observations to sophisticated Earth system models used for weather prediction and climate projection. It highlights the fundamental concepts of discretization, model components (dynamical core and physical parameterizations), and the evolution of modeling techniques driven by advancements in computing power and scientific understanding.
Over the past 50 years, Earth system models have given us incredible insight into the influence of the changing climate on regional and global scales. This course aims to provide an introduction to global atmospheric modeling and the tools we use for understanding our planet's changing climate. Topics covered include: global atmospheric models from around the world; numerical discretizations in the horizontal, vertical, and temporal dimensions; diffusion, filters, fixers, and kinetic energy spectra; physical parameterizations; computing on large-scale parallel systems; and model evaluation and intercomparison.
In today's lecture we'll set the stage for the course, providing an overview of global modeling and exploring the history of atmospheric modeling. Although regional atmospheric models were first developed for predicting the weather, the first global Earth system models were primarily intended as a means of understanding the general circulation of the atmosphere. These models were the slow culmination of centuries of scientific developments and research that built up our understanding of the Earth's atmosphere. Later, ocean and cryosphere models were developed that captured these components of the Earth system.
Global atmospheric models, climate models, and Earth system models have been used as predictive models on time scales of days to weeks, primarily for numerical weather prediction, or up to centuries for long-term climate projections and paleoclimate research. Both stand-alone and coupled modeling systems have been used as virtual laboratories for studying the Earth system and exploring the past and future climate of the planet. They have even been used to model other planets' atmospheres.
Global atmospheric models can be run on modern computer systems through a process known as discretization. Not every molecule in the Earth's atmosphere is tracked by these models; instead, we are interested in modeling and projecting average conditions over a particular region. By dividing the planet into many of these small regions, capturing the processes within those regions, and capturing the processes that enable different regions to communicate with each other, we are able to build up a representation of the whole system. Discretization in 3D also means dividing the vertical extent of the Earth's atmosphere into distinct layers. Processes and features that are captured in each grid cell include solar radiation, surface-atmosphere fluxes of water and energy, surface friction, rivers, industrial emissions, agriculture, glaciers, ecosystems, biogeochemistry, vegetation, and soils. Because these regions are usually laid out on a grid to ensure complete coverage of the planet, they are referred to as grid cells.
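As a concrete illustration of discretization, the sketch below divides the globe into latitude-longitude grid cells and computes an area-weighted average over them. The 10-degree grid and the toy temperature field are invented for this example and are not taken from any model discussed in the lecture.

```python
import numpy as np

# Divide the globe into a coarse latitude-longitude grid and compute
# an area-weighted global mean of a field defined on that grid.
nlat, nlon = 18, 36            # 10-degree grid cells
lat_edges = np.linspace(-90, 90, nlat + 1)
lat_centers = 0.5 * (lat_edges[:-1] + lat_edges[1:])

# On a sphere, a cell's area is proportional to the difference in
# sin(latitude) across the cell (longitude width is uniform here).
weights = np.sin(np.deg2rad(lat_edges[1:])) - np.sin(np.deg2rad(lat_edges[:-1]))
weights = np.repeat(weights[:, None], nlon, axis=1)
weights /= weights.sum()

# A toy temperature field: warm at the equator, cold at the poles.
field = 300.0 - 40.0 * np.sin(np.deg2rad(lat_centers))[:, None] ** 2
field = np.repeat(field, nlon, axis=1)

global_mean = float((weights * field).sum())
print(f"global mean temperature: {global_mean:.2f} K")
```

The same area-weighting idea carries over to real models, where grid cells shrink toward the poles on a latitude-longitude grid and the weighting prevents polar cells from being over-counted.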
Atmospheric models are roughly divided into two components: the dynamical core and the physical parameterizations. The dynamical core is responsible for solving the partial differential equations that govern atmospheric flows, and so handles most transport and exchange between grid cells. The physical parameterizations handle all the processes that are not captured by the equations of motion, including moisture, ice, land, chemistry, radiation, and others. In general, physical parameterizations are only applied in individual columns, that is, grid cells that extend through the full vertical depth of the atmosphere.
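This division of labor can be sketched in a few lines of toy code. Nothing here comes from a real model: the diffusive "dynamical core" and the relaxation "physics" are stand-ins chosen only to show the structure, in which the dynamical core couples neighboring cells while physics acts on each vertical column independently.

```python
import numpy as np

def dynamical_core_step(state, dt=1.0, kappa=0.1):
    """Toy horizontal transport: diffusive exchange between neighbors."""
    # Periodic in longitude (axis 1); simple nearest-neighbor mixing.
    east = np.roll(state, -1, axis=1)
    west = np.roll(state, 1, axis=1)
    return state + dt * kappa * (east + west - 2.0 * state)

def physics_step(column, dt=1.0, t_eq=280.0, tau=10.0):
    """Toy column physics: relax temperature toward an equilibrium."""
    return column + dt * (t_eq - column) / tau

nlat, nlon, nlev = 4, 8, 5
state = np.full((nlat, nlon, nlev), 300.0)  # (lat, lon, level)

for _ in range(10):
    state = dynamical_core_step(state)
    # Physics is applied one column at a time: no horizontal coupling.
    for i in range(nlat):
        for j in range(nlon):
            state[i, j, :] = physics_step(state[i, j, :])

print(f"mean temperature after 10 steps: {state.mean():.2f} K")
```

The column-independence of the physics loop is also what makes parameterizations comparatively easy to parallelize, a point that becomes important later in the course when computing on large parallel systems.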
Climate models and numerical weather prediction models used to be very different, as they were motivated by different needs. Climate models need to include processes that operate on long time scales, and need to have sufficiently coarse grid spacing to enable long-term integrations. Numerical weather prediction models, or NWP models, on the other hand, require high spatial resolution, or fine grid spacing, to maximize accuracy, and include fewer physical processes so that forecasts can be turned around on demand. However, over the past few decades the line between climate models and numerical weather prediction models has become blurred. Many major modeling centers worldwide have developed models that can be used both for climate modeling and NWP.
As mentioned earlier, regional atmospheric models were the first ones developed for numerical weather prediction. Unlike global models, regional models have lateral boundaries that need to be specified for the system to be solved. However, because they only capture a portion of the Earth's surface, these models can be run with finer resolution (smaller grid cells) at a lower computational cost than global models. Global models have no lateral boundary conditions, since the Earth wraps around in every direction; boundary conditions still need to be specified at the bottom and top of the atmosphere, however. With that said, newer developments have allowed global models to avoid issues related to lateral boundary conditions while achieving higher resolution over a limited area. These variable-resolution modeling systems have grown increasingly popular for global modeling.
Here we see three choices of grid systems that feature local high resolution, including the conformal cubed-sphere grid on the left, the centroidal Voronoi tessellation in the center, and the conformal cubed-sphere grid on the right. We'll have the opportunity later on to explore these models in detail, but first it's useful to look back on the history of global atmospheric modeling to understand how these models were developed.
Weather has always been fascinating to humankind, featuring prominently in the oral and written history of our ancestors. Even before the invention of meteorological devices, people would stare at the sky and observe the movement of clouds or the development of thunderstorms. Meteorological lore slowly developed that provided loose guidance related to weather prediction. For example, from the Bible we have this little meteorological tidbit: "When evening comes, you say it will be fair weather, for the sky is red. And in the morning, today it will be stormy, for the sky is red and overcast." In this case, very different conditions were anticipated to arise depending on whether the sky was red in the morning or in the evening. We now know that a red sky indicates the presence of a high pressure system: if it's in the evening, then a high pressure system is moving in from the west, bringing fair conditions; if it's in the morning, then a high pressure system is departing, likely to be followed by a stormy low pressure system. Other tidbits include "Seagull, seagull, sit on the sand; it's never good weather when you're on land," which originated in the observation that storms at sea will drive seabirds ashore.
The first real scientific developments related to meteorology came in the 1600s, when qualitative observations could finally be quantified. In 1643, Evangelista Torricelli invented the barometer, which allowed him to measure the pressure of air. He observed that the pressure of air was highly correlated with the weather; as with the red-sky lore earlier, he observed that dropping barometric pressure signaled an oncoming storm. In 1664, Francesco Folli invented the first practical hygrometer, which was capable of measuring humidity. Temperature measurements came next: in 1709, German physicist and engineer Daniel Gabriel Fahrenheit developed the alcohol thermometer, and later the mercury thermometer. He was also responsible for the development of the Fahrenheit scale used in the United States, which he proposed in 1724.
With these observing devices in hand, in 1765 French chemist Antoine-Laurent de Lavoisier began making daily measurements of air pressure, moisture content, and wind speed and direction. He also envisioned the first weather forecasts, as captured in this quote: "It is almost possible to predict one or two days in advance, within a rather broad range of probability, what the weather is going to be. It is even thought that it will not be impossible to publish daily forecasts, which would be very useful to society." Antoine-Laurent de Lavoisier was quite the overachiever: he is commonly referred to as the father of modern chemistry because of his heavy involvement in discoveries that led to the development of modern chemistry. For example, he stated the first version of the law of conservation of mass, contributed to defining the metric system, and wrote the first exhaustive list of chemical elements.
By the 1800s, the Industrial Revolution was beginning to drive the need to predict and understand weather patterns. The invention of the electric telegraph in 1837 at last enabled a method for communicating weather observations over a vast geographical area, and understanding how weather prediction was intimately tied to the transport of these weather systems. In 1849, under the leadership of Joseph Henry, the Smithsonian began establishing a meteorological observing network across the United States. However, the idea of a national system for predicting the weather was generally slow to take off in both Europe and the US. The advent of modern meteorology didn't really start until 1854, when a French warship and 38 merchant vessels sank in a violent storm in the northwest of the Black Sea. The director of the Paris Observatory, Urbain Le Verrier, was directed to investigate. He discovered that the storm had formed two days earlier in the southeast; if a tracking system had been in place, it could certainly have given prior warning to the ships and allowed the disaster to be averted. Urbain Le Verrier's conclusion was taken to heart, and only one year later, in 1855, a national storm warning service was established in France. A few years later in England, Robert FitzRoy used the newly installed telegraph system to produce the first synoptic charts of England, depicting wind speeds and pressure systems. He was the first to coin the term "weather forecast" to mean future predictions of weather systems, and even published the first-ever forecasts of this type. Recognition of the importance of meteorological prediction led to the formation of the International Meteorological Organization in Vienna in 1873. On the other side of the Atlantic, the US Army Signal Corps, forerunner to the National Weather Service, issued its first hurricane warning.
Into the early 1900s, weather forecasts continued to be constructed using personal expertise and through analysis of historical weather patterns. However, in 1916 the famous physicist and meteorologist Vilhelm Bjerknes had at last assembled the first complete set of equations of motion of the atmosphere. Building upon the theory of fluids, these equations corresponded to the principles of mass conservation, momentum conservation, and thermodynamics. The equations are collectively known as the primitive equations, and are taught in every introductory meteorology course.
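For reference, a standard textbook form of the hydrostatic primitive equations in pressure coordinates (a common modern presentation, not necessarily the form shown in the lecture) is:

```latex
% Horizontal momentum:
\frac{D\mathbf{v}}{Dt} + f\,\hat{\mathbf{k}} \times \mathbf{v} = -\nabla_p \Phi
% Hydrostatic balance (vertical momentum):
\frac{\partial \Phi}{\partial p} = -\frac{RT}{p}
% Mass continuity:
\nabla_p \cdot \mathbf{v} + \frac{\partial \omega}{\partial p} = 0
% Thermodynamic energy equation:
\frac{DT}{Dt} - \frac{RT}{c_p\,p}\,\omega = \frac{Q}{c_p}
```

Here v is the horizontal velocity, Φ the geopotential, f the Coriolis parameter, ω = Dp/Dt the pressure vertical velocity, T temperature, R the gas constant for dry air, c_p the specific heat at constant pressure, and Q the diabatic heating rate.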
With these equations in hand, British meteorologist Lewis Fry Richardson proposed using numerical computations to solve for the time tendency of each term over a grid, and so conduct the first numerical weather forecast. His work, entitled Weather Prediction by Numerical Process, was published in 1922, proposing a mathematical technique for systematic forecasting. Deployed during the First World War as part of a Quaker ambulance unit, Richardson had made the first attempt at mathematically forecasting the weather for a single day: in this case, the 20th of May 1910, using initial data at 7 a.m. to predict the next 6 hours. The calculation took roughly 3 months to complete, and predicted a huge rise in pressure, on the order of 145 millibars. Unfortunately, observations showed that pressure remained more or less static that day: a dramatic failure of the proposed method, and discouraging after months of work. This was not the end of the story, however. A subsequent and detailed analysis by meteorologist Peter Lynch showed that the differences could be traced to insufficient numerical smoothing, which in turn led to instability in the calculation. Proper numerical integration techniques that avoided such instabilities were not developed until years later.
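The kind of numerical instability Lynch identified can be demonstrated with a toy calculation. The sketch below is my own minimal example, not Richardson's scheme: it integrates the one-dimensional diffusion equation u_t = k u_xx with an explicit method, which is stable only when the time step satisfies dt <= dx**2 / (2k). A step just above that limit causes small perturbations to grow explosively.

```python
import numpy as np

def step(u, k, dx, dt):
    # Forward-in-time, centered-in-space update with periodic boundaries.
    return u + k * dt / dx**2 * (np.roll(u, -1) - 2 * u + np.roll(u, 1))

n, k, dx = 64, 1.0, 1.0
x = np.arange(n)
# Smooth initial state plus a tiny grid-scale perturbation, which is
# the component most vulnerable to instability.
u0 = np.sin(2 * np.pi * x / n) + 1e-6 * (-1.0) ** x

stable_dt = 0.4      # below the limit dx**2 / (2k) = 0.5
unstable_dt = 0.6    # above the limit

u_stable, u_unstable = u0.copy(), u0.copy()
for _ in range(200):
    u_stable = step(u_stable, k, dx, stable_dt)
    u_unstable = step(u_unstable, k, dx, unstable_dt)

print(f"stable run,   max |u|: {np.abs(u_stable).max():.3e}")
print(f"unstable run, max |u|: {np.abs(u_unstable).max():.3e}")
```

In the stable run the field simply decays; in the unstable run the grid-scale perturbation is amplified at every step until it swamps the solution, which is qualitatively what happened to Richardson's hand calculation.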
When those techniques were applied to the forecast problem, the results were essentially accurate. Given that a three-month-long calculation for a single day's forecast would be particularly useless, Lewis Fry Richardson proposed a method to parallelize the calculations to improve throughput.
His idea was thus: "After so much hard reasoning, may one play with a fantasy? Imagine a large hall like a theatre, except that the circles and galleries go right round through the space usually occupied by the stage. The walls of this chamber are painted to form a map of the globe. The ceiling represents the north polar regions, England is in the gallery, the tropics in the upper circle, Australia on the dress circle, and the Antarctic in the pit. A myriad computers are at work upon the weather of the part of the map where each sits, but each computer attends only to one equation or part of an equation." Notably, Richardson's "computers" here referred to actual people performing computations by hand, as he was unaware of the developments that would come 30 years later with modern computing systems.
One issue with the Bjerknes formulas used by Richardson was that their complexity made it difficult to quickly perform computations. Later work on the quasi-geostrophic, or QG, equations helped in simplifying the equation set and improved forecast performance. The partial differential equations describing fluid motion were applicable in a variety of contexts. By the 1940s, investments in electronic computer systems led to ENIAC, the first programmable computer system.
In the late 1940s, John von Neumann successfully led a team to use this system to compute and understand the behavior of explosions, a topic of significant military value at the time. Von Neumann was a visionary and saw the parallels between his work and many other fields, including numerical weather prediction, and so he advocated for the use of computers to model the atmosphere. Stretched thin with his own efforts, von Neumann recruited Jule Gregory Charney to develop a numerical framework for numerical weather prediction. Notably, Charney was an alumnus of the lab of famous meteorologist Carl-Gustaf Rossby at the University of Chicago. With access to ENIAC at Princeton, Charney conducted the first successful numerical weather prediction experiment in 1950. Only a few short years later, in 1954, the first real-time numerical weather prediction was performed by the Royal Swedish Air Force.
While numerical weather prediction had obvious direct applications, these new numerical models were also seen as powerful tools for understanding global atmospheric circulations. The first atmospheric general circulation model, or GCM, was developed in 1955 by Norman Phillips at Princeton University. His computer held five kilobytes of memory plus 10 kilobytes of data storage, and successfully modeled a two-layer atmosphere on a cylinder, 17 cells high and 16 cells in circumference. Based on Phillips' work, from 1958 to 1965 Joseph Smagorinsky of the US Weather Bureau and Syukuro Manabe developed the first three-dimensional atmospheric model built from the primitive equations; this led to the Geophysical Fluid Dynamics Laboratory. On another team, from 1956 to 1964, motivated by Phillips' paper, Yale Mintz recruited Akio Arakawa to develop a two-layer model with realistic topography. This led to the UCLA family of models, and this work was incorporated into later work by the European Centre for Medium-Range Weather Forecasts (ECMWF).
From here there was an explosion in research related to global circulation models. The different codes and methods gave rise to a complex family tree of atmospheric general circulation models, or AGCMs. We saw new and derived models arising from the National Center for Atmospheric Research, the Goddard Institute for Space Studies, the UK Met Office, and others. Codes were often shared between different modeling centers as well, making an amalgamation of many different code bases and programs that could be used for modeling the atmosphere. In 1965, a panel of the US National Academy of Sciences reported that, although global models were largely successful at reproducing gross features of the atmosphere, there were significant shortfalls in these models that could only be addressed by substantially increased computing power. This motivated heavy investments in improving computer power, and kicked off the exponential growth in computing power that we've seen to this day.
In parallel, the late 1900s saw a period of continued algorithmic development and model improvement. From the 1970s through the present, continued algorithmic developments have reduced the computing time required by models and increased their robustness and accuracy. Numerical methods from the study of computational fluid dynamics, a field particularly important in aerospace, have led to dramatic advancements in the quality of general circulation models. The climate models of the mid-1970s handled little more than the fluid equations governing atmospheric dynamics; they had simple parameterizations for solar radiation and precipitation, as these are substantial sources and sinks of energy. By the mid-1980s, the first land surface models were being employed, which enabled a better handling of moisture fluxes. Further, these land surface models prescribed ice sheets, and estimates of cloudiness allowed us to better estimate albedo and so refine the radiative energy budget. So-called "swamp" ocean models were added around 1990, which allowed oceans to act as reservoirs for heat but didn't feature oceanic circulations. These were further refined over the next decade, and models with oceanic currents were introduced in the mid-1990s.
With the eruption of Pinatubo in 1991 and the growing threat of acid rain, there was also substantial interest in the effects of volcanism and sulfate aerosols on the climate system. Further advancements into the early 2000s brought these models closer to reality. Microphysics parameterizations were developed that could better simulate precipitation and use aerosol concentrations to determine precipitation rates. A more detailed hydrologic cycle was added to land models, incorporating precipitation, soil moisture, and runoff into a unified system. Ocean models that simulated the full 3D structure of the ocean were also developed, incorporating the deep ocean circulations. Further, many models were augmented to include the carbon cycle. Through the late 2000s, more nuanced functionality came online, including atmospheric chemistry parameterizations and interactive vegetation. Since then, river and estuary models, wave models, ice sheets, and marine ecosystems have become commonplace in climate modeling systems.
In response to the development of increasingly powerful numerical and data assimilation algorithms, along with exponentially increasing computational performance, numerical weather prediction forecast skill has been steadily rising since the 1950s. Here we see a plot from the National Oceanic and Atmospheric Administration of 500-millibar operational forecast skill for 36-hour and 72-hour forecasts, a typical metric for assessing the quality of forecast models. Overall, these models have increased from around 20% skill at the advent of NWP to nearly 90% skill today. 72-hour forecasts have generally lagged 36-hour forecasts by about 15 years, with modern forecasts around 70% skill.
Here's an analogous plot of forecast skill, measured via anomaly correlation, from the European Centre for Medium-Range Weather Forecasts. Skill scores have been rising fairly steadily, and differences in skill that originally persisted between Southern Hemisphere and Northern Hemisphere forecasts have largely been erased since the early 2000s. Even 10-day forecasts now show 50 percent skill under this metric. New investments are being made in subseasonal-to-seasonal, or S2S, forecasting methods as well, which rely on low-frequency modes of variability connected to atmosphere-ocean interactions. One such mode that dominates the equatorial tropics, the Madden-Julian Oscillation, allows for skill in forecasting on time scales of up to a month.
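The anomaly correlation metric mentioned above is simple to state: it is the correlation between forecast and observed departures from a reference climatology. The sketch below uses entirely synthetic data (the fields, noise levels, and sample size are invented for illustration) to show the computation.

```python
import numpy as np

def anomaly_correlation(forecast, observed, climatology):
    """Correlation of forecast and observed anomalies from climatology."""
    fa = forecast - climatology   # forecast anomaly
    oa = observed - climatology   # observed anomaly
    return (fa * oa).sum() / np.sqrt((fa**2).sum() * (oa**2).sum())

rng = np.random.default_rng(0)
climatology = np.full(100, 280.0)                    # flat reference state
observed = climatology + rng.normal(0.0, 5.0, 100)   # synthetic "truth"

# A perfect forecast scores 1; adding forecast error lowers the score.
perfect = anomaly_correlation(observed, observed, climatology)
noisy = anomaly_correlation(observed + rng.normal(0.0, 5.0, 100),
                            observed, climatology)

print(f"perfect forecast ACC: {perfect:.2f}")
print(f"noisy forecast ACC:   {noisy:.2f}")
```

Operational centers typically compute this over 500-millibar geopotential height fields and treat an ACC of about 0.6 as the threshold for a useful forecast; the synthetic version here only shows the mechanics.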
Ensemble methods, which rely on multiple simultaneous forecasts performed in parallel, each tweaked to capture uncertainties in the initial fields, allow for probabilistic forecasts up to 12 months in advance. One such system currently being employed for long-term forecasting over North America is the North American Multi-Model Ensemble, or NMME.
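The ensemble idea can be sketched with a toy model. Everything below is illustrative: the "dynamics" is an arbitrary one-variable equation standing in for a real forecast model, and the perturbation size is made up. The point is only the structure: perturb the initial state, run every member through the same model, and read probabilities off the spread of outcomes.

```python
import numpy as np

def toy_model(state, nsteps=50, dt=0.1):
    """Hypothetical nonlinear dynamics standing in for a real model."""
    for _ in range(nsteps):
        state = state + dt * np.sin(state)  # simple forward-Euler step
    return state

rng = np.random.default_rng(42)
n_members = 100
# Each member starts from the analysis plus a small random perturbation.
members0 = 1.0 + rng.normal(0.0, 0.05, n_members)
members = toy_model(members0)

# A probabilistic statement derived from the ensemble spread.
prob_above = float((members > 2.5).mean())
print(f"ensemble mean: {members.mean():.2f}")
print(f"P(state > 2.5) = {prob_above:.2f}")
```

Real ensemble systems also perturb model physics and combine members from different models entirely, as the NMME does, but the mean-plus-probability readout shown here is the common output format.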
The supercomputers we now use for global climate modeling have grown increasingly powerful, in response to the roughly exponential growth in the performance of modern chipsets. In the United States, most climate simulations are now performed on the National Center for Atmospheric Research Cheyenne supercomputer, which clocks in at 5.34 petaflops over 145,152 computing cores, or the US Department of Energy Cori supercomputer, which clocks in at 14.01 petaflops over 622,336 computing cores. Here, "flops" refers to floating-point operations per second.
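As a back-of-the-envelope check using the figures quoted above, per-core performance on both machines works out to a few tens of gigaflops:

```python
# Per-core throughput implied by the quoted system totals.
PETA = 1e15  # one petaflop = 1e15 floating-point operations per second

machines = {
    "Cheyenne": (5.34 * PETA, 145_152),
    "Cori": (14.01 * PETA, 622_336),
}

for name, (flops, cores) in machines.items():
    per_core = flops / cores / 1e9  # gigaflops per core
    print(f"{name}: {per_core:.1f} gigaflops per core")
```

Numbers like these are peak theoretical rates; atmospheric models, which are typically memory-bandwidth bound, sustain only a fraction of peak in practice.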
Supercomputing performance increased consistently from the early 1990s through the 2010s. However, in the 2010s, physical limitations related to die size and energy costs required a switch to new hardware models, which has slowed this growth somewhat. Graphical processing units, or GPUs, which are perhaps more widely recognized for their use in computer gaming, allow roughly a three-fold speedup per watt over CPUs, but require more exposed parallelism. Nonetheless, exponential growth persists at a slightly slower rate as supercomputing systems move toward graphical processing units to achieve higher throughput. All new US Department of Energy supercomputers in the coming decade, including Perlmutter, Frontier, and Aurora, are anticipated to be largely GPU-based.
Supercomputers have enabled global climate models and NWP models to reach finer and finer grid spacing. In 2019, models from around the world produced 40-day simulations with grid spacing of only a few kilometers as part of the DYAMOND project. Here we see a two-day video from a recent simulation by the US Department of Energy Simple Cloud-Resolving E3SM Atmosphere Model at three-kilometer globally uniform resolution, depicting an atmospheric river making landfall in North America. Grayscale in this image depicts column-integrated water vapor, while colors indicate precipitation. Global climate models and NWP models are moving closer together: nowadays, unified frameworks like the UK Met Office Unified Model enable both NWP simulations and global climate system simulations to be conducted within a single framework. Here we see one such simulation of Hurricane Sandy making landfall in the continental United States, conducted with the Community Earth System Model. Okay, that's everything for today. In the next lecture, we'll start diving into some of the technical details behind these models.