Tagged: statistical mechanics

  • Joseph Nebus 6:00 pm on Saturday, 24 September, 2016 Permalink | Reply
    Tags: handedness, statistical mechanics

    Some Thermomathematics Reading 


    I have been writing this month, albeit more slowly. I’m also reading, likewise more slowly than usual. Here are some things that caught my attention.

    One is from Elke Stangl, of the Elkemental blog. “Re-Visiting Carnot’s Theorem” is about one of the centerpieces of thermodynamics: how much work you can possibly get out of an engine, and how much must be lost no matter how good your engineering is. Thermodynamics is the secret spine of modern physics. It was born of supremely practical problems, many of them related to railroads or factories. And it teaches how much solid information can be drawn about a system even if we know nothing about the system’s components. Stangl also brings ASCII art back from its Usenet and Twitter homes. Some things are just best done as a text picture.

    Meanwhile on the CarnotCycle blog Peter Mander writes on “Le Châtelier’s principle”. This is related to the question of how temperatures affect chemical reactions: how fast they will be, how completely they’ll use the reagents, and how a system that’s reached equilibrium will react to something that unsettles that equilibrium. We call such an unsettling a perturbation. Mander reviews the history of the principle, which hasn’t always been well-regarded, and explores why it might have gone under-appreciated for decades.

    And lastly MathsByAGirl has published a couple of essays on spirals. Who doesn’t like them? Three-dimensional spirals, that is, helices, have some obvious things to talk about. A big one is that there’s such a thing as handedness: the mirror image of a coil is not the same thing as the coil flipped around. This handedness has analogues and implications throughout chemistry and biology. Two-dimensional spirals, by contrast, don’t have handedness like that. But we’ve grouped spirals into many different types, each with its own beauty. They’re worth looking at.

     
  • Joseph Nebus 3:02 pm on Wednesday, 21 October, 2015 Permalink | Reply
    Tags: statistical mechanics

    Phase Equilibria and the usefulness of μ 


    The CarnotCycle blog entry for this month is about chemical potential. “Chemical potential” in thermodynamics covers a lot of interesting phenomena. It gives a way to model chemistry using the mechanisms of statistical mechanics. It lets us study a substance that’s made of several kinds of particle. This potential is written with the symbol μ, and I admit I don’t know how that symbol got picked over all the possible alternatives.

    The chemical potential varies with the number of particles. Each different type of particle gets its own chemical potential, so there may be a μ1 and μ2 and μ3 and so on. The chemical potential μ1 is how much the free energy varies as the count of particles-of-type-1 varies; μ2 is how much the free energy varies as the count of particles-of-type-2 varies, and so on. This might strike you as similar to the way pressure and volume of a gas depend on each other, or, if you retained a bit more of thermodynamics, how the temperature and entropy vary. This is so. Pressure and volume are conjugate variables, as are temperature and entropy, and so are chemical potential and particle number. (And for a wonder, “particle number” means exactly what it suggests: the number of particles of that kind in the system.)
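    In symbols (my notation, not the post’s): write F for the Helmholtz free energy of a mixture held at temperature T and volume V, with n_i particles of type i. Then

    dF = -S \, dT - P \, dV + \sum_i \mu_i \, dn_i , \qquad \mu_i = \left( \frac{\partial F}{\partial n_i} \right)_{T, V, n_{j \neq i}}

    and each pair (\mu_i, n_i) enters the free energy exactly the way the conjugate pairs (T, S) and (P, V) do.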


    carnotcycle


    It was the American mathematical physicist Josiah Willard Gibbs who introduced the concepts of phase and chemical potential in his milestone monograph On the Equilibrium of Heterogeneous Substances (1876-1878) with which he almost single-handedly laid the theoretical foundations of chemical thermodynamics.

    In a paragraph under the heading “On Coexistent Phases of Matter” Gibbs mentions – in passing – that for a system of coexistent phases in equilibrium at constant temperature and pressure, the chemical potential μ of any component must have the same value in every phase.

    This simple statement turns out to have considerable practical value as we shall see. But first, let’s go through the formal proof of Gibbs’ assertion.

    An important result


    Consider a system of two phases, each containing the same components, in equilibrium at constant temperature and pressure. Suppose a small quantity dn_i moles of any component i is transferred from phase A in…

    View original post 819 more words

     
  • Joseph Nebus 9:22 pm on Tuesday, 12 May, 2015 Permalink | Reply
    Tags: statistical mechanics

    Reversible and irreversible change 


    Entropy is hard. It’s deceptively easy to describe, and the concept is popular, but truly understanding it is a challenge. In this month’s entry CarnotCycle talks about thermodynamic entropy and where it comes from. I don’t promise you will understand it after this essay, but you will be closer to understanding it.


    carnotcycle


    Reversible change is a key concept in classical thermodynamics. It is important to understand what is meant by the term as it is closely allied to other important concepts such as equilibrium and entropy. But reversible change is not an easy idea to grasp – it helps to be able to visualize it.

    Reversibility and mechanical systems

    The simple mechanical system pictured above provides a useful starting point. The aim of the experiment is to see how much weight can be lifted by the fixed weight M1. Experience tells us that if a small weight M2 is attached – as shown on the left – then M1 will fall fast while M2 is pulled upwards at the same speed.

    Experience also tells us that as the weight of M2 is increased, the lifting speed will decrease until a limit is reached when the weight difference between M2 and M1 becomes…

    View original post 692 more words

     
    • sheldonk2014 9:51 pm on Tuesday, 12 May, 2015 Permalink | Reply

      The only dynamics I want to experience is how much strength it takes to lift my butt, or the kinetics it takes to lift my arm to my nose
      As always Sheldon


      • Joseph Nebus 4:11 pm on Friday, 15 May, 2015 Permalink | Reply

        Ah, but even those dynamics are amazing. And the way the body works can tell us amazing things about the way physics works: Julius von Mayer’s observation that people’s blood was a deeper red — holding more oxygen — in the tropics than in Europe was one of the pieces leading people to the conservation of energy. Hermann von Helmholtz’s career in physics was inspired, in part, by a teacher proclaiming no one would ever know how fast a nerve impulse travelled; he didn’t believe it, and became one of science’s immortals. There are astounding things like this everywhere.


  • Joseph Nebus 8:12 pm on Tuesday, 14 April, 2015 Permalink | Reply
    Tags: statistical mechanics, Thermodynamics

    Spontaneity and the performance of work 


    I’d wanted just to point folks to the latest essay in the CarnotCycle blog. This thermodynamics piece is a bit about how work gets done, and how it relates to two kinds of variables describing systems. The two kinds are known as intensive and extensive variables, and considering them helps guide us to a different way to regard physical problems.
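    If the names are new, a quick gloss (mine, not the essay’s): intensive variables, like pressure and temperature, don’t depend on how much of the system you have, while extensive variables, like volume and entropy, scale with the system’s size. In symbols, scaling the system by a factor \lambda,

    X(\lambda \cdot \text{system}) = \lambda \, X \quad \text{(extensive)}, \qquad x(\lambda \cdot \text{system}) = x \quad \text{(intensive)}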


    carnotcycle


    Imagine a perfect gas contained by a rigid-walled cylinder equipped with a frictionless piston held in position by a removable external agency such as a magnet. There are finite differences in the pressure (P1>P2) and volume (V2>V1) of the gas in the two compartments, while the temperature can be regarded as constant.

    If the constraint on the piston is removed, will the piston move? And if so, in which direction?

    Common sense, otherwise known as dimensional analysis, tells us that differences in volume (dimensions L³) cannot give rise to a force. But differences in pressure (dimensions ML⁻¹T⁻²) certainly can. There will be a net force of (P1 − P2) per unit area of piston, driving it to the right.

    – – – –

    The driving force

    In thermodynamics, there exists a set of variables which act as “generalised forces” driving a system from one state to…

    View original post 290 more words

     
  • Joseph Nebus 9:40 pm on Thursday, 8 May, 2014 Permalink | Reply
    Tags: Kelvin, statistical mechanics

    The ideal gas equation 


    I did want to mention that the CarnotCycle big entry for the month is “The Ideal Gas Equation”. The Ideal Gas Equation is one of the more famous equations that isn’t F = ma or E = mc², which I admit isn’t a very big group of famous equations; but, at the very least, its content is familiar enough.

    If you keep a gas at constant temperature and increase the pressure on it, its volume decreases, and vice-versa; that’s Boyle’s Law. If you keep a gas at constant volume and decrease its pressure, its temperature decreases, and vice-versa; that’s Gay-Lussac’s Law. And Charles’s Law says that if a gas is kept at constant pressure, and the temperature increases, then the volume increases, and vice-versa. (Each of these is probably named for the wrong person, because they always are.) The Ideal Gas Equation combines all these relationships into one neat, easily understood package, as the summary below shows.
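    Written out, using the PV = kT notation of the excerpt below (my condensation, with k standing for a different constant in each equation):

    PV = k \ (\text{fixed } T), \qquad \frac{P}{T} = k \ (\text{fixed } V), \qquad \frac{V}{T} = k \ (\text{fixed } P) \quad \Longrightarrow \quad PV = kT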

    Peter Mander describes some of the history of these concepts and equations, and how they came together, along with the interesting way they connect to the absolute temperature scale and to absolute zero. Absolute temperatures — the Kelvin scale — and absolute zero are familiar enough ideas these days that it’s difficult to remember they were ever new, controversial, and intellectually challenging ideas to develop. I hope you enjoy.


    carnotcycle


    If you received formal tuition in physical chemistry at school, then it’s likely that among the first things you learned were the 17th/18th century gas laws of Mariotte and Gay-Lussac (Boyle and Charles in the English-speaking world) and the equation that expresses them: PV = kT.

    It may be that the historical aspects of what is now known as the ideal (perfect) gas equation were not covered as part of your science education, in which case you may be surprised to learn that it took 174 years to advance from the pressure-volume law PV = k to the combined gas law PV = kT.


    The lengthy timescale indicates that putting together closely associated observations wasn’t regarded as a must-do in this particular era of scientific enquiry. The French physicist and mining engineer Émile Clapeyron eventually created the combined gas equation, not for its own sake, but because he needed an…

    View original post 1,628 more words

     
  • Joseph Nebus 8:52 pm on Thursday, 6 February, 2014 Permalink | Reply
    Tags: science history, statistical mechanics

    The Liquefaction of Gases – Part I 


    I know, or at least I’m fairly confident, there are a couple of readers here who like deeper mathematical subjects. It’s fine to come up with simulated Price is Right games or figure out what grades one needs to pass the course, but those aren’t particularly challenging subjects.

    But those are hard to write, so, while I stall, let me point you to CarnotCycle, which has a nice historical article about the problem of liquefying gases, a problem steeped not just in thermodynamics but in engineering. If you’re a little familiar with thermodynamics you likely won’t be surprised to see names like William Thomson, James Joule, or Willard Gibbs turn up. I was surprised to see T O’Conor Sloane show up in the additional reading; science fiction fans might vaguely remember that name, as he was the editor of Amazing Stories for most of the 1930s, between Hugo Gernsback and Raymond Palmer. It’s often a surprising world.


    carnotcycle

    On Monday 3 December 1877, the French Academy of Sciences received a letter from Louis Cailletet, a 45-year-old physicist from Châtillon-sur-Seine. The letter stated that Cailletet had succeeded in liquefying both carbon monoxide and oxygen.

    Liquefaction as such was nothing new to 19th century science, it should be said. The real news value of Cailletet’s announcement was that he had liquefied two gases previously considered ‘non condensable’.

    While a number of gases such as chlorine, carbon dioxide, sulfur dioxide, hydrogen sulfide, ethylene and ammonia had been liquefied by the simultaneous application of pressure and cooling, the principal gases comprising air – nitrogen and oxygen – together with carbon monoxide, nitric oxide, hydrogen and helium, had stubbornly refused to liquefy, despite the use of pressures up to 3000 atmospheres. By the mid-1800s, the general opinion was that these gases could not be converted into liquids under any circumstances.

    But in…

    View original post 1,350 more words

     
    • Damyanti 6:47 am on Thursday, 13 February, 2014 Permalink | Reply

      I usually run far from all topics science-related, but I like this little bit of history here.


      • Joseph Nebus 11:46 pm on Thursday, 13 February, 2014 Permalink | Reply

        I’m glad you do enjoy. I like a good bit of history myself, mathematics and science included, and might go looking for more topics that have a historical slant.


    • LFFL 10:43 pm on Sunday, 23 February, 2014 Permalink | Reply

      You lost me at “deeper mathematical subjects”. I barely have addition & subtraction down.


      • Joseph Nebus 4:43 am on Monday, 24 February, 2014 Permalink | Reply

        Aw, but the deeper stuff is fascinating. For example, imagine you have a parcel of land with some really complicated boundary, all sorts of nooks and crannies and corners and curves and all that. If you just walk around the outside, keeping track of how far you walk and in what direction, then, you can use a bit of calculus to tell exactly how much area is enclosed by the boundary, however complicated a shape it is.

        Isn’t that amazing? You never even have to set foot inside the property, just walk around its boundary.
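        That walk-the-boundary trick is Green’s theorem at work. Here is a minimal sketch of the computation (my illustration, not anything from the comments), assuming the walk is recorded as a list of corner coordinates; it is the shoelace formula:

        def enclosed_area(corners):
            """Shoelace formula: the area enclosed by a polygonal boundary,
            given only the (x, y) points visited while walking around it once."""
            n = len(corners)
            twice_area = sum(
                corners[i][0] * corners[(i + 1) % n][1]
                - corners[(i + 1) % n][0] * corners[i][1]
                for i in range(n)
            )
            return abs(twice_area) / 2

        # A 3-by-2 rectangular lot, walked corner to corner: area 6.
        print(enclosed_area([(0, 0), (3, 0), (3, 2), (0, 2)]))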


        • LFFL 4:46 am on Monday, 24 February, 2014 Permalink | Reply

          Wow. I’m impressed by your brain power. I just wasn’t born with a brain for much math beyond the basics.


          • Joseph Nebus 4:25 am on Tuesday, 25 February, 2014 Permalink | Reply

            Aw, you’re kind to me, and unkind to you. It’s not my brainpower, at least. The result is a consequence of some pretty important work you learn early on in calculus, and I’d expect you could understand the important part of it without knowing more than the basics.


  • Joseph Nebus 9:49 pm on Wednesday, 13 November, 2013 Permalink | Reply
    Tags: statistical mechanics

    Reading the Comics, November 13, 2013 


    For this week’s round of comic strips there’s almost a subtler theme than “they mention math in some way”: several have links to statistical mechanics and the problem of recurrence. I’m not sure what’s gotten into Comic Strip Master Command that they sent out instructions to do comics I can tie to the astounding interactions of infinities and improbable events, but it makes me wonder if I need to write a few essays about it.

    Gene Weingarten, Dan Weingarten, and David Clark’s Barney and Clyde (October 30) summons the classic “infinite monkeys” problem of probability for its punch line. The concept is that if you had something producing strings of letters at random (taken to be monkeys because, I suppose, it’s assumed they would hit every key without knowing what sensibly comes next), it would, given enough time, produce any given result. The idea goes back a long way, and it’s blessed with a compelling mental image even though typewriters are a touch old-fashioned these days.
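    To see why time is all it takes, here is my sketch of the standard argument (not the strip’s): on a 26-key typewriter, each independent block of N keystrokes matches a chosen N-character passage with probability p = 26^{-N}, small but positive. So the chance the passage never appears in k blocks is

    \left(1 - 26^{-N}\right)^k \longrightarrow 0 \quad \text{as } k \rightarrow \infty

    and with enough time the passage turns up with probability 1.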

    It seems to have gotten its canonical formulation in Émile Borel’s 1913 article “Statistical Mechanics and Irreversibility”, as you might expect, since statistical mechanics brings up the curious problem of entropy. In short: every physical interaction is time-reversible. When two gases mix — say, clear air and the pink knockout gas of 1960s TV shows — look at the interaction of one clear-gas molecule and one pink-gas molecule and you can’t tell whether it’s playing forward or backward. But look at the entire room and it’s obvious whether the gases are mixing or unmixing. How can something be time-reversible at every step of every interaction but not in whole?

    The idea got a second compelling metaphor with Jorge Luis Borges’s Library of Babel, with a bit more literary class and in many printings fewer monkeys.

    (More …)

     
    • elkement 5:34 pm on Friday, 15 November, 2013 Permalink | Reply

      The last one is my favorite – probably because I am fascinated by conspiracy theories and people sending unsolicited manuscripts about their alleged refuting of Einstein.


  • Joseph Nebus 5:43 am on Thursday, 21 March, 2013 Permalink | Reply
    Tags: statistical mechanics

    Gibbs’ Elementary Principles in Statistical Mechanics 


    I had another discovery from the collection of books at archive.org, now that I thought to look for it: Josiah Willard Gibbs’s Elementary Principles in Statistical Mechanics, originally published in 1902 and reprinted in 1960 by Dover. Its extended title, Developed With Especial Reference To The Rational Foundation of Thermodynamics, gives you a taste of Gibbs’s writing. Gibbs was an astounding figure even in a field that seems to draw out astounding figures, and he’s a good candidate for the title of “greatest scientist to come from the United States”.

    He lived within walking distance of Yale (where his father and then he taught) nearly his whole life, working in near isolation but with an astounding talent for organizing the many complex and confused ideas in the study of thermodynamics into a neat, logical science. Some great scientists have the knack for finding important work to do; some great scientists have the knack for finding ways to express work so the masses can understand it. Gibbs … well, perhaps it’s a bit much to say the masses understand it, but the language of modern thermodynamics and of quantum mechanics is very much the language he spoke a century-plus ago.

    My understanding is he published almost all his work in the journal Transactions of the Connecticut Academy of Arts and Sciences, in a show of hometown pride which probably left the editors baffled but, I suppose, happy to print something this fellow was very sure about.

    To give some idea why they might have found him baffling, though, consider the first paragraph of Chapter 1, which is accurate and certainly economical:

    We shall use Hamilton’s form of the equations of motion for a system of n degrees of freedom, writing q_1, \cdots q_n for the (generalized) coördinates, \dot{q}_1, \cdots \dot{q}_n for the (generalized) velocities, and

    F_1 \, dq_1 + F_2 \, dq_2 + \cdots + F_n \, dq_n [1]

    for the moment of the forces. We shall call the quantities F_1, \cdots F_n the (generalized) forces, and the quantities p_1 \cdots p_n , defined by the equations

    p_1 = \frac{d\epsilon_p}{d\dot{q}_1}, p_2 = \frac{d\epsilon_p}{d\dot{q}_2}, etc., [2]

    where \epsilon_p denotes the kinetic energy of the system, the (generalized) momenta. The kinetic energy is here regarded as a function of the velocities and coördinates. We shall usually regard it as a function of the momenta and coördinates, and on this account we denote it by \epsilon_p . This will not prevent us from occasionally using formulas like [2], where it is sufficiently evident the kinetic energy is regarded as function of the \dot{q}’s and q’s. But in expressions like d\epsilon_p/dq_1 , where the denominator does not determine the question, the kinetic energy is always to be treated in the differentiation as function of the p’s and q’s.

    (There’s also a footnote I skipped because I don’t know an elegant way to include it in WordPress.) Your friend the physics major did not understand that on first read any more than you did, although she probably got it after going back and reading it a touch more slowly. And his writing is just like that: 240 pages, and I’m not sure any of them could be appreciably tightened.


    Also, I note I finally reached 9,000 page views! Thank you; I couldn’t have done it without at least twenty of you, since I’m pretty sure I’ve obsessively clicked on my own pages at minimum 8,979 times.

     
    • Peter Mander 8:05 pm on Thursday, 21 March, 2013 Permalink | Reply

      Fully agree with your assessment of Gibbs’ greatness. The US should be immensely proud of him.


  • Joseph Nebus 5:46 pm on Tuesday, 5 March, 2013 Permalink | Reply
    Tags: statistical mechanics

    Reblog: Mixed-Up Views Of Entropy 


    The blog CarnotCycle, which tries to explain thermodynamics — a noble goal, since thermodynamics is a really big, really important, and criminally unpopularized part of science and mathematics — here starts from an “Unpublished Fragment” by the great Josiah Willard Gibbs to talk about entropy.

    Gibbs — a strong candidate for the greatest scientist the United States ever produced, complete with fascinating biographical quirks to make him seem accessibly peculiar — gave to statistical mechanics much of the mathematical form and power that it now has. Gibbs had planned to write something about “On entropy as mixed-up-ness”, which certainly puts in one word what people think of entropy as being. The concept is more powerful and more subtle than that, though, and CarnotCycle talks about some of the subtleties.


    carnotcycle


    Tucked away at the back of Volume One of The Scientific Papers of J. Willard Gibbs is a brief chapter headed ‘Unpublished Fragments’. It contains a list of nine subject headings for a supplement that Professor Gibbs was planning to write to his famous paper “On the Equilibrium of Heterogeneous Substances”. Sadly, he completed his notes for only two subjects before his death in April 1903, so we will never know what he had in mind to write about the sixth subject in the list: On entropy as mixed-up-ness.

    Mixed-up-ness. It’s a catchy phrase, with an easy-to-grasp quality that brings entropy within the compass of minds less given to abstraction. That’s no bad thing, but without Gibbs’ guidance as to exactly what he meant by it, mixed-up-ness has taken on a life of its own and has led to entropy acquiring the derivative associations of chaos and disorder…

    View original post 627 more words

     
  • Joseph Nebus 3:56 am on Monday, 4 February, 2013 Permalink | Reply
    Tags: absolute zero, enthalpy, statistical mechanics

    Fun With General Physics 


    I’m sure my interest in the Internet Archive version of Landau, Akhiezer, and Lifshitz’s General Physics will wane soon enough. But for now I’m still digging around and finding stuff that delights me. For example, here, from the end of section 58 (Solids and Liquids):

    As the temperature decreases, the specific heat of a solid also decreases and tends to zero at absolute zero. This is a consequence of a remarkable general theorem (called Nernst’s theorem), according to which, at sufficiently low temperatures, any quantity representing a property of a solid or liquid becomes independent of temperature. In particular, as absolute zero is approached, the energy and enthalpy of a body no longer depend on the temperature; the specific heats c_p and c_V, which are the derivatives of these quantities with respect to temperature, therefore tend to zero.

    It also follows from Nernst’s theorem that, as T \rightarrow 0 , the coefficient of thermal expansion tends to zero, since the volume of the body ceases to depend on the temperature.

     
  • Joseph Nebus 6:26 am on Sunday, 3 February, 2013 Permalink | Reply
    Tags: Internet Archive, soviet mathematics, statistical mechanics

    General Physics from the Internet Archive 


    Mir Books is a company that puts out downloadable, translated copies of mostly Soviet mathematics and physics books. As often happens I started following them kind of on a whim, in the faith that someday I’d see a math text I just absolutely had to have. It hasn’t quite reached that point, although a post from today identified one I do like which, naturally enough, they aren’t publishing. It’s available from the Internet Archive instead.

    The book is General Physics, by L D Landau, A I Akhiezer, and E M Lifshitz. The title is just right; it gets you from mechanics to fields to crystals to thermodynamics to chemistry to fluid dynamics in about 370 pages. The scope and size probably tell you this isn’t something for a mass audience; the book’s appropriate for an upper-level undergraduate or a grad student, or someone who needs a reference for a lot of physics.

    So I can’t recommend this for normal readers, but if you’re the sort of person who sees beauty in a quote like:

    Putting r here equal to the Earth’s radius R, we find a relation between the densities of the atmosphere at the Earth’s surface (n_E) and at infinity (n_∞):

    n_{\infty} = n_E e^{-\frac{GMm}{kTR}}

    then by all means read on.
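    For the curious, that relation is the Boltzmann factor at work (my gloss; the book’s own derivation is longer): a molecule of mass m at distance r from the Earth’s center has gravitational potential energy U(r) = -GMm/r, and an isothermal atmosphere distributes itself as

    n(r) = n_E \, e^{-\left(U(r) - U(R)\right)/kT}

    so letting r \rightarrow \infty gives the quoted n_{\infty} = n_E e^{-\frac{GMm}{kTR}}.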

     
    • elkement 11:56 am on Sunday, 3 February, 2013 Permalink | Reply

      Thanks a lot for this pointer – downloaded immediately! I do indulge in recapturing physics basics, incl. all fields I am not concerned with on a daily basis. This book seems to be concise and really comprises many different sub-fields in physics. I am also a Landau-Lifshitz fan in general and theoretically I “ought to” own their landmark books on theoretical physics – but buying these would be quite an investment.


      • Joseph Nebus 2:06 am on Monday, 4 February, 2013 Permalink | Reply

        I’ve certainly downloaded my own copy. I remember calling on this book a couple of times in my thesis work, although I’m sorry to say I didn’t understand it as well as I really should have. I like to think I know it better now, though.


  • Joseph Nebus 10:57 pm on Tuesday, 27 November, 2012 Permalink | Reply
    Tags: statistical mechanics

    The Music Goes Round And Round 


    So. In my analysis of an “Infinite Jukebox” tune — one in which the song is free to jump between two points, with a probability of \frac{1}{3} of jumping from the one-minute mark to the two-minute mark, and an equal likelihood of jumping from the two-minute mark back to the one-minute mark — I concluded that, on average, the song would lose a minute just as often as it gained one, and so we could expect the song to be just as long as the original. The really big flaw in that analysis is that I made allowance for only the one jump. The three-minute song with two points at which it could jump, which I used for the model, can play straight through with no cuts or jumps (three minutes long), or it can play jumping from the one-minute to the two-minute mark (a two-minute version), or it can play from the start to the second minute, jump back to the first, and continue to the end (a four-minute version). But if you play any song on the Infinite Jukebox you see that more can happen.
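    A quick way to see what repeated jumps do is to simulate the full process. Here is a minimal Monte Carlo sketch (mine, not anything from the post), assuming that each time the playhead reaches a jump point the jump fires with probability 1/3:

    import random

    def play_once(p_jump=1/3):
        """One playthrough of the model: a 3-minute song with splice points
        at the 1:00 and 2:00 marks.  Returns minutes of audio played."""
        minutes, pos = 1.0, 1.0          # the first minute always plays
        while True:
            if pos == 1.0:
                if random.random() < p_jump:
                    pos = 2.0            # splice forward, skipping the middle minute
                else:
                    minutes += 1.0       # play the middle minute
                    pos = 2.0
            else:                        # pos == 2.0
                if random.random() < p_jump:
                    pos = 1.0            # splice back to the 1:00 mark
                else:
                    minutes += 1.0       # play the final minute and end
                    return minutes

    n = 100_000
    mean = sum(play_once() for _ in range(n)) / n
    print(f"average length over {n:,} plays: {mean:.3f} minutes")

    When I run it, the sample mean comes out very close to three minutes, though individual playthroughs scatter widely.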

    (More …)

     
  • Joseph Nebus 5:19 pm on Thursday, 22 November, 2012 Permalink | Reply
    Tags: statistical mechanics

    Infinite Buggles 


    Working through my circle of friends have been links to The Infinite Jukebox, an amusing web site which takes a song, analyzes points at which clean edits can be made, and then randomly jumps through them so that the song just never ends. The idea is neat, and its visual representation of the song and the places where it can — but doesn’t have to — jump forward or back can be captivating. My Dearly Beloved has been particularly delighted with the results on “I Am A Camera”, by the Buggles, as it has many good edit points and can sound quite natural after the jumps if you aren’t paying close attention to the lyrics. I recommend playing that at least a bit so you get some sense of how it works, although listening to an infinitely long rendition of the Buggles, or any other band, is asking for a lot.

    One question that comes naturally to mind, at least to my mind, is: given there are these various points where the song can skip ahead or skip back, how long should we expect such an “infinite” rendition of a song to take? What’s the average, that is, the expected value, of the song’s playing time? I wouldn’t dare jump into analyzing “I Am A Camera”, not without working on some easier problems to figure out how it should be done, but let’s look.
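    In symbols, as a plain definition rather than anything worked out yet: if each way v a playthrough can go yields a rendition of length t_v and occurs with probability P(v), the number we want is the expected value

    E[T] = \sum_{v} t_v \, P(v)

    with the sum running over every possible sequence of jumps taken and not taken.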

    (More …)

     
  • Joseph Nebus 6:03 am on Saturday, 20 October, 2012 Permalink | Reply
    Tags: coulomb gas, random matrix, statistical mechanics

    Reblog: Random matrix theory and the Coulomb gas 


    inordinatum’s guest blog post here discusses something which, I must confess, isn’t going to be accessible to most of my readers. But it’s interesting to me, since it addresses many topics that are either directly in or close to my mathematical research interests.

    The random matrix theory discussed here is the study of what we can say about matrices when we aren’t told the numbers in the matrix, but are told the distribution of the numbers — how likely any cell within the matrix is to be within any particular range. From that start it might sound like almost nothing could be said; after all, couldn’t anything go? But in exactly the same way that we’re able to speak very precisely about random events in the context of probability and statistics — for example, that a (fair) coin flipped a million times will come up tails pretty near 500,000 times, and will not come up tails 600,000 times — we’re able to speak confidently about the properties of these random matrices.
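    For a taste of that confidence, here is a minimal numpy sketch (my illustration, not anything from the guest post): sample a large symmetric matrix with Gaussian entries and look at where its eigenvalues land. Whatever particular numbers the sampler draws, the spectrum hugs the same predictable interval, Wigner’s semicircle.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000

    # A Wigner matrix: only the entry distribution is specified, not the entries.
    a = rng.normal(size=(n, n))
    h = (a + a.T) / np.sqrt(2)   # symmetrize; off-diagonal entries have variance 1

    # The semicircle law says the eigenvalues fill [-2*sqrt(n), 2*sqrt(n)].
    eigs = np.linalg.eigvalsh(h)
    print(f"eigenvalues span [{eigs[0]:.1f}, {eigs[-1]:.1f}]")
    print(f"semicircle prediction: [-{2*np.sqrt(n):.1f}, {2*np.sqrt(n):.1f}]")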

    In any event, please do not worry about understanding the whole post. I found it fascinating and that’s why I’ve reblogged it here.


    inordinatum

    Today I have the pleasure of presenting you a guest post by Ricardo, a good physicist friend of mine in Paris, who is working on random matrix theory. Enjoy!

    After I wrote a nice piece of hardcore physics for my science blog (in Portuguese, I am sorry), Alex asked me to come by and repay the favor. I am happy to write a few lines about the basis of my research in random matrices, and one of the first nice surprises we have while learning the subject.

    In this post, I intend to present a few neat tricks I learned while tinkering with Random Matrix Theory (RMT). It is a pretty vast subject, whose ramifications extend to nuclear physics, information theory, particle physics and, surely, mathematics as a whole. One of the main questions in this subject is: given a matrix M whose entries are taken randomly from a…

    View original post 1,082 more words

     