Hi Everyone.. I hope you are having a good day...:-)
Here is the first image processed on my newly rebuilt, refurbished computer.
This is the Elephant's Trunk Nebula.
While my computer was down I focused a lot of time and attention on this object with my Canon 300mm FD F2.8 SSC Fluorite lens and my 2 Canon modified cameras.
This is over 15 hours of data: 150 images of 6 minutes each at ISO 400, stacked, plus just under 4 hours from 39 images shot with the Optolong 7nm H-Alpha filter. The two stacks were merged in Photoshop, using the Ha data as a second channel alongside the RGB.
I hope all of you like it... :-) The details are below..
Have a good day and clear skies everyone. :-)
The Elephant's Trunk nebula is a concentration of interstellar gas and dust within the much larger ionized gas region IC 1396 located in the constellation Cepheus about 2,400 light years away from Earth. The piece of the nebula shown here is the dark, dense globule IC 1396A; it is commonly called the Elephant's Trunk nebula because of its appearance at visible light wavelengths, where there is a dark patch with a bright, sinuous rim. The bright rim is the surface of the dense cloud that is being illuminated and ionized by a very bright, massive star (HD 206267) that is just to the west of IC 1396A. The entire IC 1396 region is ionized by the massive star, except for dense globules that can protect themselves from the star's harsh ultraviolet rays.
The Elephant's Trunk nebula is now thought to be a site of star formation, containing several very young (less than 100,000 yr) stars that were discovered in infrared images in 2003. Two older (but still young, a couple of million years, by the standards of stars, which live for billions of years) stars are present in a small, circular cavity in the head of the globule. Winds from these young stars may have emptied the cavity.
The combined action of the light from the massive star, ionizing and compressing the rim of the cloud, and the wind from the young stars, shifting gas from the centre outward, leads to very high compression in the Elephant's Trunk nebula. This pressure has triggered the current generation of protostars.
Camera: Canon EOS T2i/550D Modified And TEC Cooled
Lens: Canon 300mm FD F2.8 SSC Fluorite Set At F4 Modified For Canon EF Mount
Exposure: 6 Minutes Each
Number of Stacked Images: 150
Number of Dark Frames: 0
Number of Bias Frames: 0
Filters: Baader UV/IR cut filter
Mount: Takahashi EM-200 Temma 2
Guide Scope: None
Stacking Software: DeepSkyStacker
Processing Software: Photoshop CS6 And Adobe Camera Raw
Shooting Date/Time 7/15/2017 10:09 PM
Shooting Date/Time 7/18/2017 10:35 PM
Shooting Date/Time 7/19/2017 10:19 PM
Camera: Canon EOS T3i/600D Fully Modified
Lens: Canon 300mm FD F2.8 SSC Fluorite Set At F2.8 Modified For Canon EF Mount
Exposure: 6 Minutes Each
Number of Stacked Images: 39
Number of Dark Frames: 0
Number of Bias Frames: 0
Filters: Optolong EOS-C 7nm H-Alpha
Mount: Takahashi EM-200 Temma 2
Guide Scope: None
Stacking Software: DeepSkyStacker
Processing Software: Photoshop CS6 And Adobe Camera Raw
Shooting Date/Time 7/22/2017 11:38 PM
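As a quick sanity check, the integration totals quoted above follow directly from the frame counts and exposure length (a small Python calculation, not part of the original workflow):

```python
# Total integration time for each stack: frames x minutes per frame.
rgb_frames, ha_frames = 150, 39   # stacked RGB and H-alpha sub-frames
exposure_min = 6                  # minutes per sub-frame

rgb_hours = rgb_frames * exposure_min / 60   # 15.0 h, the quoted "15 hr"
ha_hours = ha_frames * exposure_min / 60     # 3.9 h, "just under 4 hr"

print(f"RGB: {rgb_hours:.1f} h, H-alpha: {ha_hours:.1f} h")
```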
Saturday, 29 July 2017
Wednesday, 12 July 2017
Last year, the existence of an unknown planet in our Solar system was announced. However, this hypothesis was subsequently called into question as biases in the observational data were detected. Now Spanish astronomers have used a novel technique to analyse the orbits of the so-called extreme trans-Neptunian objects and, once again, they point out that there is something perturbing them: a planet located at a distance between 300 to 400 times the Earth-Sun separation.
Scientists continue to argue about the existence of a ninth planet within our Solar System. At the beginning of 2016, researchers from the California Institute of Technology (Caltech, USA) announced that they had evidence of the existence of this object, located at an average distance of 700 AU or astronomical units (700 times the Earth-Sun separation) and with a mass ten times that of the Earth.
Their calculations were motivated by the peculiar distribution of the orbits found for the trans-Neptunian objects (TNO) of the Kuiper belt, which apparently revealed the presence of a Planet Nine or X in the confines of the Solar System.
However, scientists from the Canadian-French-Hawaiian project OSSOS detected biases in their own observations of the orbits of the TNOs, which had been systematically directed towards the same regions of the sky, and considered that other groups, including the Caltech group, may be experiencing the same issues. According to these scientists, it is not necessary to propose the existence of a massive perturber (a Planet Nine) to explain these observations, as these are compatible with a random distribution of orbits.
Now, however, two astronomers from the Complutense University of Madrid have applied a new technique, less exposed to observational bias, to study a special type of trans-Neptunian objects: the extreme ones (ETNOs, located at average distances greater than 150 AU and that never cross Neptune's orbit). For the first time, the distances from their nodes to the Sun have been analysed, and the results, published in the journal 'MNRAS: Letters', once again indicate that there is a planet beyond Pluto.
The nodes are the two points at which the orbit of an ETNO, or any other celestial body, crosses the plane of the Solar System. These are the precise points where the probability of interacting with other objects is the largest, and therefore, at these points, the ETNOs may experience a drastic change in their orbits or even a collision.
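Those nodal crossing distances follow from standard orbital geometry: with semi-major axis a, eccentricity e and argument of perihelion ω, the orbit crosses the plane of the Solar System at r = a(1 - e²)/(1 ± e·cos ω). A minimal Python sketch of that relation (the example elements are hypothetical, not taken from the paper):

```python
import math

def nodal_distances(a_au, e, arg_perihelion_deg):
    """Heliocentric distances (AU) at the ascending and descending nodes,
    the two points where an orbit crosses the plane of the Solar System."""
    p = a_au * (1 - e**2)                    # semi-latus rectum
    w = math.radians(arg_perihelion_deg)
    r_ascending = p / (1 + e * math.cos(w))
    r_descending = p / (1 - e * math.cos(w))
    return r_ascending, r_descending

# Illustrative ETNO-like orbit (made-up elements, not a real object):
r_asc, r_desc = nodal_distances(a_au=300, e=0.8, arg_perihelion_deg=290)
print(f"ascending node: {r_asc:.0f} AU, descending node: {r_desc:.0f} AU")
```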
Like the comets that interact with Jupiter
"If there is nothing to perturb them, the nodes of these extreme trans-Neptunian objects should be uniformly distributed, as there is nothing for them to avoid, but if there are one or more perturbers, two situations may arise," explains Carlos de la Fuente Marcos, one of the authors, to SINC. "One possibility is that the ETNOs are stable, and in this case they would tend to have their nodes away from the path of possible perturbers," he adds, "but if they are unstable they would behave as the comets that interact with Jupiter do, that is, tending to have one of the nodes close to the orbit of the hypothetical perturber."
Using calculations and data mining, the Spanish astronomers have found that the nodes of the 28 ETNOs analysed (and the 24 extreme Centaurs with average distances from the Sun of more than 150 AU) are clustered in certain ranges of distances from the Sun; furthermore, they have found a correlation, where none should exist, between the positions of the nodes and the inclination, one of the parameters which defines the orientation of the orbits of these icy objects in space.
"Assuming that the ETNOs are dynamically similar to the comets that interact with Jupiter, we interpret these results as signs of the presence of a planet that is actively interacting with them in a range of distances from 300 to 400 AU," says De la Fuente Marcos, who emphasizes: "We believe that what we are seeing here cannot be attributed to the presence of observational bias".
Until now, studies that challenged the existence of Planet Nine using the data available for these trans-Neptunian objects argued that there had been systematic errors linked to the orientations of the orbits (defined by three angles), due to the way in which the observations had been made. Nevertheless, the nodal distances mainly depend on the size and shape of the orbit, parameters which are relatively free of observational bias.
"It is the first time that the nodes have been used to try to understand the dynamics of the ETNOs", the co-author points out, as he admits that discovering more ETNOs (at the moment, only 28 are known) would permit the proposed scenario to be confirmed and subsequently constrain the orbit of the unknown planet via the analysis of the distribution of the nodes.
The authors note that their study supports the existence of a planetary object within the range of parameters considered both in the Planet Nine hypothesis of Mike Brown and Konstantin Batygin from Caltech, and in the original one proposed in 2014 by Scott Sheppard from the Carnegie Institution and Chadwick Trujillo from Northern Arizona University; it also follows the lines of their own earlier studies (the latest led by the Instituto de Astrofísica de Canarias), which suggested that there is more than one unknown planet in our Solar System.
Is there also a Planet Ten?
De la Fuente Marcos explains that the hypothetical Planet Nine suggested in this study has nothing to do with another possible planet or planetoid situated much closer to us, and hinted at by other recent findings. Also applying data mining to the orbits of the TNOs of the Kuiper Belt, astronomers Kathryn Volk and Renu Malhotra from the University of Arizona (USA) have found that the plane on which these objects orbit the Sun is slightly warped, a fact that could be explained if there is a perturber of the size of Mars at 60 AU from the Sun.
"Given the current definition of planet, this other mysterious object may not be a true planet, even if it has a size similar to that of the Earth, as it could be surrounded by huge asteroids or dwarf planets," explains the Spanish astronomer, who goes on to say: "In any case, we are convinced that Volk and Malhotra's work has found solid evidence of the presence of a massive body beyond the so-called Kuiper Cliff, the furthest point of the trans-Neptunian belt, at some 50 AU from the Sun, and we hope to be able to present soon a new work which also supports its existence."
Tuesday, 11 July 2017
The more observations we make, the more corrections we apply, and even when we form seemingly absurd concepts about something, we come closer to the truth. So keep forming concepts, because it is a blessing to us that we can think.
Posted by Adityadhar Dwivedi
Before I start the discussion about dark energy: you may have read in high-school books that everything in this universe is made up of atoms, and that is simply called matter. So a question arises: is normal, baryonic, matter all there is in the universe? Based on our current studies, the simple answer is no. It makes up only a tiny fraction of the total mass-energy of the universe, roughly ~5%. It is crazy, even a little horrifying, that all the matter we see, touch, feel and observe is only 5%. Then what is the rest? About 95% of the energy density of the universe is unexplained. In a previous post I discussed dark matter, which perhaps should not be called matter at all, since what we actually detect is invisible gravity; it could simply be called "dark gravity". [Check out the previous post.] So what about dark energy? I am here, and you are here, to start the discussion. So let's start!

Dark energy is a mysterious, anti-gravitational energy thought to be responsible for the expansion of our universe, an expansion that can proceed faster than the speed of light. Here you may try to contradict this statement by pointing out that Einstein set a limit at the speed of light, meaning nothing can travel faster than light. But this is the expansion rate of space itself, not the motion of an object. Then another question arises: what is our universe expanding into? What is out there?
Most physicists think of an 11-dimensional hyperspace in which universes float like soap bubbles, each bubble representing a universe with its own dimensions, all vibrating like membranes [this is the picture behind M-theory and its branes]. This theory is really hypothetical, but to understand it well you have to imagine an 11-dimensional hyperspace. Now let's move further.
OK, dark energy: what kind of feeling does this phrase create when you hear it? Maybe something mysterious, unknown, or strange. Well, you're quite right. Dark energy is the name given to a mysterious force that is speeding up the expansion of our universe and makes up roughly ~70% of the universe's energy density. So what is so strange about it?
My friend, the whole concept is strange because it is even more mysterious than dark matter [which, as I said, really could be called "dark gravity"]: we have neither detected it directly nor do we have any strong explanation for its cause. So you can imagine just how strange dark energy is!
Unravelling the secrets of Dark Energy
Here I am putting forward my own concepts and speculations about the mystery of dark energy.
[Follow this link: https://youtu.be/mH91InyS6Uw]
Think of the universe at the moment of the singularity, an ultra-dense state of the cosmos at the Planck scale of time and length. At about 10^-43 seconds, sudden quantum fluctuations took place and triggered a vast explosion, the Big Bang, in which everything originated as a single grand unified force that ultimately separated into the four fundamental forces. When the universe was a baby, it was a dense soup of particles filling all of space, a state that can be called a quark-gluon plasma. At that time both matter and antimatter could have originated, and matter somehow won out over antimatter [this is also a mystery; wait for an upcoming post about it]. So where does dark energy fit into the earliest phases of the universe? Think of the moment when quantum fluctuations occurred in the singularity: the Bang sent an enormous shockwave through the fabric of spacetime, which could really have powered the expansion of space. That shockwave would have driven the expansion at an unimaginable rate, while particles and antiparticles annihilated into pure energy; later, the expanding and cooling space allowed the survivors to bind into atoms and eventually collapse into stars.
That's only concept......... :)
- Suppose all matter is x and all antimatter is y. We are made of baryons, which means we are part of x; so how could we have formed if x and y had already annihilated to zero?
- Now imagine that a leftover of roughly 0.0000000001% of x accounts for the 5% of the universe's energy density we see today; what, then, was the total amount of x and y? And if x and y did not fully annihilate, is it possible that the expansion of the universe simply left too little room for the remaining x and y to meet, so the survivors went on to bind together and form atoms?
Monday, 10 July 2017
WASHINGTON — A long-overdue fiscal year 2017 spending bill unveiled early May 1 will provide NASA with $19.65 billion, more than $600 million above the original request for the agency by the previous administration.
The omnibus spending bill, released by congressional appropriators after extended negotiations, provides more money overall for the agency than earlier House and Senate bills, including significant increases for exploration programs and planetary science. It also funds programs that the Trump administration seeks to cancel or restructure in its 2018 budget proposal.
The $19.653 billion NASA receives in the bill is $628 million above the original request for the agency in the Obama administration’s final budget request in February 2016. It is $368 million above the $19.285 billion NASA received in fiscal year 2016.
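The quoted figures are mutually consistent, as a quick check shows (the implied request amount is derived from the stated deltas, not quoted directly from the bill):

```python
# Figures from the article, in billions of dollars.
enacted = 19.653   # FY2017 omnibus amount for NASA
fy2016 = 19.285    # FY2016 enacted amount

increase_over_2016 = enacted - fy2016   # should match the stated $368M
implied_request = enacted - 0.628       # back out the Obama-era request

print(f"increase over FY2016: ${increase_over_2016 * 1000:.0f}M")
print(f"implied FY2017 request: ${implied_request:.3f}B")
```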
The biggest winner in the spending bill is NASA’s exploration program, which gets $4.32 billion, nearly $1 billion more than the original request but similar to what the House and Senate offered in their bills last year. That total includes $2.15 billion for the Space Launch System and $1.35 billion for Orion.
The report accompanying the spending bill allows NASA to use exploration funding to support technologies such as advanced propulsion, asteroid deflection and grappling systems intended for use on the Asteroid Redirect Mission (ARM), provided they do “not distract from the overarching goal of sending humans to Mars.” The Trump administration’s fiscal year 2018 budget blueprint, released March 16, announced plans to cancel ARM.
Science programs will receive $5.76 billion in the spending bill, above both the requested $5.6 billion and lower levels in the House and Senate bills. Planetary science wins a large increase, to nearly $1.85 billion, well above the 2017 request of $1.52 billion and the $1.63 billion it received in 2016. That total includes $408 million for the Mars 2020 rover mission, including language directing NASA to add a small helicopter technology demonstration to the mission as long as it does not delay the mission’s launch.
That planetary science funding also includes $275 million for Europa missions, both the Europa Clipper multiple flyby spacecraft and a proposed lander. Language in the bill requires NASA to launch Europa Clipper no later than 2022 and the lander no later than 2024, although NASA officials have recently said they don’t expect the lander mission to be ready for launch until at least 2025. The Trump administration’s 2018 budget blueprint supported Europa Clipper but included no funding for a Europa lander.
NASA’s Earth science program, the subject of potential cuts, received $1.92 billion, the same as it received in 2016 but less than the $2.03 billion sought by the Obama administration. That funding includes $90 million for the Pre-Aerosol, Clouds, and ocean Ecosystem (PACE) mission, which the Trump administration targeted for cancellation in its 2018 budget blueprint.
NASA’s space technology program receives $686.5 million in the bill, the same as it received in 2016 but less than $826.7 million requested by the Obama administration. Of that, $130 million is set aside for the Restore-L satellite servicing project, which the Trump administration said in its 2018 budget blueprint that it seeks to restructure, calling it “duplicative.”
Space operations, which includes the International Space Station and related projects, receives $4.95 billion in the bill, the same as the Senate offered in its bill but $125 million less than requested. Commercial crew, part of space operations, will receive $1.185 billion, the same amount as requested.
The bill also provides $100 million for NASA’s Office of Education, the same as the original request. The Trump administration, in its 2018 budget blueprint, seeks to close the office and focus NASA’s educational activities through programs in the science mission directorate.
The full omnibus appropriations bill, which funds other federal government agencies besides NASA, was released exactly seven months into the 2017 fiscal year. Those agencies had been operating under continuing resolutions (CRs) that funded programs at 2016 levels. The latest CR, passed April 28, provided a one-week extension until May 5 to give appropriators time to finalize the omnibus bill.
Stellar nurseries, the birthplace of new stars, are not as cozy and colour-coordinated as Pinterest nurseries. Stellar nurseries feature dust and gas rather than lovable characters and perfect shades of blue or pink—cold expanses rather than cozy nooks.
As scientists have pieced together the story of how stars form, a model has emerged that highlights the role of a strong magnetic field. However, research recently published in The Astrophysical Journal Letters reveals that stellar nurseries may have environments that are much more varied and complex than previously thought. This information could help us better understand how stars like our sun form.
Artist's impression of chaotic magnetic field lines very near a newly emerging protostar.
Image Credit: NRAO/AUI/NSF; D. Berry
Before we get to these results, it helps to visualize where stellar nurseries form. The space between star systems is not empty; it is filled with a dilute mixture of gas and dust called the interstellar medium. Some areas are denser than others, forming giant interstellar clouds. (As a side note, NASA and the University of Colorado, Boulder launched a new rocket payload just last Tuesday that gathered data on the interstellar cloud between us and the star Beta Scorpii.)
Under certain conditions, the gravity of an interstellar cloud can overcome the pressure of the gas within it and cause the cloud to collapse and break into pieces. These pieces become the birthplace of new stars as gravity pulls in more and more gas, creating increasingly dense, hot objects: protostars on the path to becoming full-fledged stars. During the protostar phase, a star-to-be is still pulling in mass from the surrounding interstellar cloud and hasn’t started fusion reactions yet.
This isn’t the whole story, however. Many pieces of interstellar clouds that, by gravity alone, should form stars do not. The process appears to be shaped by things like the turbulence and magnetic fields of the cloud. Although the details of their influence aren’t well understood, typical models suggest that magnetic fields play the dominant role in regulating star formation.
To better understand the impact of magnetic fields on the birth of stars, scientists in the global ALMA collaboration, the Atacama Large Millimeter/submillimeter Array, turned their instruments on an object called Ser-emb 8, a typical young protostar about 1,400 light years away in a region where many stars are forming.
Located in the Atacama Desert of Chile, ALMA is a collection of ground-based telescopes (very precise antennas) that together study the universe in millimetre and submillimetre wavelengths—the range of light between infrared light and radio waves. This light contains valuable information on things like the origins of galaxies, stars, planets, and the molecules related to life.
ALMA can’t “see” magnetic fields directly. Instead it maps the polarization of the light given off by warm grains of dust that tend to align with the magnetic field of a region. Using this polarization map, scientists can infer the magnetic field in an area. Thanks to ALMA’s precise instruments, the Ser-emb 8 measurement is the most sensitive measurement scientists have ever made of a small-scale magnetic field around a young protostar.
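The inference step described above rests on simple geometry: the polarization position angle comes from the Stokes Q and U maps, and the plane-of-sky magnetic field is taken to be perpendicular to it, because dust grains align with their long axes perpendicular to the field. A simplified sketch of that idea (not the actual ALMA reduction pipeline; the Stokes values are made up):

```python
import math

def pol_angle_deg(Q, U):
    """Polarization position angle (degrees) from Stokes Q and U."""
    return 0.5 * math.degrees(math.atan2(U, Q))

def inferred_field_angle_deg(Q, U):
    """Plane-of-sky B-field direction: rotate the polarization
    angle by 90 degrees, keeping it in the range [0, 180)."""
    return (pol_angle_deg(Q, U) + 90.0) % 180.0

# Example with made-up Stokes values (Jy/beam):
Q, U = 0.002, 0.002
print(f"polarization angle: {pol_angle_deg(Q, U):.1f} deg")
print(f"inferred B-field angle: {inferred_field_angle_deg(Q, U):.1f} deg")
```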
The result was a surprise.
This texture represents the magnetic field orientation in the region surrounding the Ser-emb 8 protostar, as measured by ALMA. The gray region is the millimetre-wavelength dust emission.
Image Credit: ALMA (ESO/NAOJ/NRAO); P. Mocz, C. Hull, CfA.
Previous research suggests that stars typically form in regions with strong magnetic fields. In a young protostar this is evidenced by an hourglass-shaped magnetic field, which astronomers have observed before. However, the team, led by Charles Hull from the Harvard-Smithsonian Center for Astrophysics, found that Ser-emb 8 doesn’t fit the model. Although it is clearly a young protostar, there is no hourglass in sight: the magnetic field of Ser-emb 8 is chaotic, randomly oriented, and doesn’t match the large-scale magnetic field of the region.
To better understand this result and its implications, the team ran simulations of an interstellar cloud collapsing and forming a young protostar. Each simulation featured magnetic fields and turbulence of different strengths. From these simulations, the team created mock observations of the magnetic field.
By comparing the mock observations to the real ones, astronomers found that Ser-emb 8 is likely forming from the collapse of an interstellar cloud with a weak magnetic field. The star formation and the magnetic field of Ser-emb 8 appear to be controlled by turbulence within the cloud, not a strong magnetic field.
The implications of this result go beyond describing the environment in Ser-emb 8’s tiny piece of the sky. With this observation, astronomers have demonstrated that stars can form under a wider variety of conditions than previously thought. In other words, stellar nurseries may display more of nature’s creativity than we realized.
The commitment of China and its emerging companies to the sharing economy policy has reached outer space, thanks to the initiative of a company that designs a telescope that can be rented by astronomy buffs.
"We want everyone to be able to access that technology because the satellites are very expensive and are beyond the reach of most people," Feng Yang, president of Chinese company Spacety, a nanosatellite manufacturer, told EFE.
The telescope will be launched into space on a satellite, and its functions can be controlled by users via its website, which will allow them to browse for images they want to obtain.
The company's objective is to bring space to ordinary people as well as to create opportunities for those interested in areas such as astronomy to investigate or simply to enjoy their passion without having to pay the high costs.
"At a dinner with my two friends who are astronomy enthusiasts, I realised that they always spent a lot of money on telescopes and that surprised me a lot. They spent a million yuan for a telescope ($A193,000)," he said.
After the conversation with his friends, he concluded that if he could put a telescope into space, in the style of a mini-Hubble, "the images that can be captured are going to be better."
According to Feng, this is the first company in the world that will introduce such a business model, since "Hubble belongs to the US government."
"There is no telescope that is open for everyone. With ours, anyone in the world can get to our website, control the telescope in space and take a look at wherever they want," he said.
Although the exact amount is currently unknown, the investment in the project is expected to be more than 10 million yuan.
This idea, the entrepreneur said, is in line with China's prominent interest in promoting the "sharing economy" policy, aimed at distributing resources of everyday things to reduce costs and allowing people to enjoy services that were previously unaffordable.
Researchers and students in the Graphene Flagship are preparing for two exciting experiments in collaboration with the European Space Agency (ESA) to test the viability of graphene for space applications. Both experiments will launch between 6 and 17 November 2017, testing graphene in zero-gravity conditions to determine its potential in space applications including light propulsion and thermal management.
The Graphene Flagship is a pan-European research initiative dedicated to developing new technologies based on graphene, the single-atom-thick allotrope of carbon with excellent electrical, mechanical, thermal and optical properties. A fundamental aspect of the Graphene Flagship is training students and young researchers. These ambitious space-related experiments are an excellent opportunity for Flagship students and researchers to gain new experiences in cutting-edge research. Join the Graphene Flagship as we follow the progress -- from the early stages in the laboratory to the moments of weightlessness!
In a fully student-led experiment, a team of Graphene Flagship graduate students from Delft Technical University (TU Delft; Netherlands) will participate in ESA Education's Drop Your Thesis! programme. Their successful proposal will use microgravity conditions in the ZARM Drop Tower (Bremen, Germany) to test graphene for light sails. By shining laser light on suspended graphene membranes supplied by Flagship partner Graphenea, the experiment will test how much thrust can be generated, which could lead to a new way of propelling satellites in space using light from lasers or the sun.
The PhD student team, named GrapheneX, consists of Santiago Cartamil Bueno, Davide Stefani, Vera Janssen and Rocco Gaudenzi, all research students in Herre van der Zant's group at TU Delft. Santiago Cartamil Bueno, project leader for the GrapheneX team, said: "We split tasks between the team and things are working well. We are very ambitious with the quality of the experiments. We really want to do it properly, so we are committed to do real science in this project."
ESA Education's Drop Your Thesis! programme offers students the opportunity to design an experiment for the ZARM Drop Tower in Bremen, Germany, which simulates the low gravity and vacuum conditions of space. The 146 m ZARM Drop Tower creates extreme microgravity conditions down to one millionth of Earth's gravitational force. In vacuum, a capsule containing the experiment is catapulted up and down the tower, providing a total of 9.3 seconds of weightlessness.
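The quoted 9.3 seconds is consistent with simple free-fall kinematics: a catapulted capsule that rises to apex height h spends t = 2·sqrt(2h/g) in free fall, roughly double what a plain drop from the same height would give. A back-of-the-envelope check (the effective flight height inside the tower is my assumption, not an official figure):

```python
import math

g = 9.81    # gravitational acceleration, m/s^2
h = 106.0   # assumed effective flight height inside the 146 m tower, m

t_drop = math.sqrt(2 * h / g)   # one-way free fall from height h
t_catapult = 2 * t_drop         # up-and-down catapult trajectory

print(f"drop only: {t_drop:.1f} s, catapult: {t_catapult:.1f} s")
```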
Running concurrently is an experiment investigating how graphene can improve efficiency in heat transfer in loop heat pipes -- cooling systems used extensively in satellites and aerospace instruments. The experiment is a collaboration between Graphene Flagship partners at the Microgravity Research Centre, Université libre de Bruxelles, Belgium; the Cambridge Graphene Centre, University of Cambridge, UK; the Institute for Organic Synthesis and Photoreactivity, National Research Council of Italy (CNR), Italy; and Leonardo Spa, Italy, a global leader in aerospace and producer of a variety of components and systems for space applications.
A significant part of the loop heat pipe is the wick, typically made of porous metal. In this experiment, the wicks will be coated with different types of graphene-related materials to improve the efficiency of the heat pipe. The coated wicks will be tested in a low-gravity parabolic flight operated by ESA in partnership with Novespace, France. During each 3-hour flight, the specially modified plane will make a series of 30 parabolic ascents with around 25 seconds of weightlessness in each parabola.
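Per flight, those parabolas add up to a useful amount of experiment time, as a quick calculation shows:

```python
parabolas = 30      # parabolic ascents per 3-hour flight
weightless_s = 25   # seconds of weightlessness per parabola

total_s = parabolas * weightless_s
print(f"~{total_s} s (~{total_s / 60:.1f} min) of microgravity per flight")
```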
Involved in the experiment are Graphene Flagship researchers Vanja Mišković and Fabio Iermano, both working at the Microgravity Research Centre, and Lucia Lombardi and Yarjan Samad, both at the Cambridge Graphene Centre. As well as the on-ground experiments, the young researchers will experience weightlessness on board the low-gravity flights in November.
"I'm really excited because this will be my first zero gravity experience," said Lombardi. "The idea is to use graphene to improve the thermal conductivity and the capillary pressure by growing a sponge in the pores of the wicks," she added.
"We want to test different kinds of coatings since the graphene and graphene oxide have different properties, but we are hoping to achieve good results with both of the coatings," added Mišković. "I'm very excited, I know that not a lot of people get this opportunity."
Andrea Ferrari (University of Cambridge), Science and Technology Officer of the Graphene Flagship and chair of its management panel, added: "Space is the new frontier for the Graphene Flagship. These initial experiments will test the viability of graphene-enabled devices for space applications. The combined strengths of the Graphene Flagship, Flagship partners and the European Space Agency, as well as global leader in aerospace applications Leonardo, give a strong basis to reach a high technology readiness level."
Jari Kinaret (Chalmers University of Technology, Sweden), Director of the Graphene Flagship, said: "These two projects exemplify the two-fold character of the Graphene Flagship: the loop heat pipe project is targeting a specific application, while the light sail project is firmly linked to basic research and builds upon the unique combination of properties that only graphene can offer. I am particularly proud of the fact that one of these projects was initiated by students working on an area completely disconnected from space applications: this demonstrates the creativity of the next generation of researchers, and shows the sometimes surprising links between different parts of our Flagship -- or maybe I should say spaceship?"
Sunday, 9 July 2017
Got a little bored LOL. The full Moon is out, so there is very little to do until I receive the 3 Optolong filters I ordered, which will let me start doing narrowband imaging when the Moon is out.
So I went and pulled up some old data of Comet C2013 US10 I shot last year and did a little reprocessing on it. Not too bad, if I may say so myself.
Comments from January 17, 2016
I had a very good night last night shooting the conjunction between Comet C2013 US10 Catalina and The Pinwheel Galaxy M101.
This is one single frame out of 163 sub-frames I shot last night. LOL. It’s fantastic!!! Plus I’m going to work on a new video. With all those frames being almost the same it will be great, but it will take time to process all 163 frames the same way I did this one. LOL. Wish me luck.
The details are below.
Have a good day and clear skies everyone.
Camera: Canon EOS T3i/600D (Un-Modified)
Lens: Canon 75-300mm set at 150mm F5.0
Exposure: 2 Minutes (120 seconds) Single Frame
Mount: Celestron CG4 with Clock Drive
Processing Software: Photoshop CS6, Camera Raw
Shooting Date/Time: 1/17/2016 4:02:35 AM
Saturday, 8 July 2017
We're building the universe.
Michael E. Price
Senior Lecturer in Psychology and Director, Centre for Culture and Evolution, Brunel University London
Does humanity exist to serve some ultimate, transcendent purpose? Conventional scientific wisdom says no.
As physicist Lawrence Krauss puts it in his latest book, our evolution on this planet is just a "cosmic accident". If you believe otherwise, many would accuse you of suffering from some kind of religious delusion.
I don't think this view of life is necessarily correct. Despite this, my worldview is entirely naturalistic – it doesn't rely on invoking any supernatural powers. And I usually do agree with conventional scientific wisdom.
However, I know of one possible mechanism by which life could, in fact, be endowed with a natural purpose. The idea, just published in the journal Complexity, is highly speculative but worth considering.
In biological natural selection, genes' ability to replicate themselves depends on how well they can encode traits that permit organisms to out-reproduce other members of their own species.
Such traits – for example camouflage to avoid predators or eyes to enable vision – are adaptations to the environment, as opposed to traits that are just by-products of adaptations or random genetic noise.
Clearly, the purpose of these adaptations is to solve difficult problems (like seeing, digesting or thinking).
Because organisms are bundles of complex adaptations, they are the most improbably complex things in the universe. And improbable complexity is, in fact, the hallmark of natural selection – the fundamental way in which we recognise that a trait actually is an adaptation.
This makes them improbably low in 'entropy', which is the degree of disorder in a physical system. A basic law of physics is that entropy tends to always be increasing so that systems become more disordered (known as the "second law of thermodynamics").
It's because of this law that you can crack an egg and mix it all together to make an omelette (making it more disordered), but you can't turn the omelette neatly back into an egg with shell, white and yolk (making it more ordered).
Because natural selection is the process that "designs" organisms – incrementally organising random, disordered matter into complex, functional organs – it is the most powerful anti-entropic process that we know of.
Without the incremental changes that natural selection allows, the only way a complex adaptation like a mammalian eye could come into existence would be as the result of random chance. And the likelihood of that is extremely low.
Biological natural selection explains how adaptations have purpose (to facilitate survival and reproduction), and why organisms behave purposefully.
It does not explain, however, how life in general could have any transcendent purpose. To figure out the point of our existence we require a higher-order explanation, like the one I describe.
My higher-order explanation is based on cosmologist Lee Smolin's theory of cosmological natural selection. Smolin founded his theory on the increasingly popular view that our universe exists in an innumerably vast population of replicating universes – a multiverse.
Many physicists put stock in the idea of there being a multiverse, because its existence is predicted by eternal inflation, our most promising model of universe origins.
Smolin reasoned that in a multiverse, universes that were better at reproducing would become more common. He proposed that they could be created from existing black holes.
And if black holes are how universes reproduce, then cosmological natural selection would favour universes that contained more black holes.
In this theory, life is simply the accidental by-product of processes "designed" by selection to produce black holes.
Smolin's theory has considerable intuitive appeal. It seems analogous to Darwin's selection theory. And black holes do seem to be likely candidates to give birth to new universes.
A black hole is an infinitely small concentration of space-time, matter and energy – a singularity. And it's exactly this type of phenomenon we believe the Big Bang started from.
In one glaring aspect, however, Smolin's theory falls short of being analogous to Darwin's. It does not predict that the most improbably complex feature of our universe will be the one most likely to be an adaptation produced by cosmological natural selection.
That least entropic feature is life, rather than black holes.
Smolin does identify life as the least entropic known thing. His theory, however, does not make the connection between entropy and selection.
That is, it doesn't acknowledge that just as improbably low entropy is the hallmark of selection operating at the biological level, this is likely to be true at the cosmological level as well.
The future of life
If life is, in fact, the universe's reproductive system, the implication is that sufficiently evolved intelligence could acquire the ability to create new cosmic environments.
In order to be habitable, these baby universes would need to replicate the physical laws of the life form's native universe. Cosmologists expect that in billions of years, our universe will cease being habitable.
By that point, however, life could conceivably have become intelligent enough to produce new life-supporting universes, perhaps by civilisations "building" something similar to black holes.
However, scientists currently lack the methods to test the idea conclusively. A start would be to discover that there are indeed other universes – something that astronomers are currently looking for.
The theory does make one basic prediction, however: human technological progress is likely to continue into the vastly distant future.
If cosmological selection "designed" life to use its technology for universe reproduction, then it seems reasonable to expect that life will succeed in this regard – just as you'd expect an eye produced by biological selection to actually succeed in seeing.
That doesn't mean that unceasing technological progress is guaranteed – after all, we could use our technology to destroy ourselves. Nevertheless we can reasonably expect humanity – or whatever it evolves into – to be sticking around for a long, long time.
It's not a new idea to propose in general terms that life might constitute a mechanism for cosmological evolution; good histories of the idea have been written elsewhere.

The new aspect of my research is that it spells out exactly why life, as the least entropic known thing in the universe, is more likely than black holes (or anything else) to be a mechanism of universe reproduction. I hope others will continue to explore this idea.
Kat Volk (Left) and Renu Malhotra (Right)
An unknown, unseen "planetary mass object" may lurk in the outer reaches of our solar system, according to new research on the orbits of minor planets to be published in the Astronomical Journal. This object would be different from — and much closer than — the so-called Planet Nine, a planet whose existence yet awaits confirmation.
In the paper, Kat Volk (Above Left) and Renu Malhotra (Above Right) of the University of Arizona's Lunar and Planetary Laboratory, or LPL, present compelling evidence of a yet-to-be-discovered planetary body with a mass somewhere between that of Mars and Earth. The mysterious mass, the authors show, has given away its presence — for now — only by controlling the orbital planes of a population of space rocks known as Kuiper Belt objects, or KBOs, in the icy outskirts of the solar system.
While most KBOs — debris left over from the formation of the solar system — orbit the sun with orbital tilts (inclinations) that average out to what planetary scientists call the invariable plane of the solar system, the most distant of the Kuiper Belt's objects do not. Their average plane, Volk and Malhotra discovered, is tilted away from the invariable plane by about eight degrees. In other words, something unknown is warping the average orbital plane of the outer solar system.
"The most likely explanation for our results is that there is some unseen mass," says Volk, a postdoctoral fellow at LPL and the lead author of the study. "According to our calculations, something as massive as Mars would be needed to cause the warp that we measured."
The Kuiper Belt lies beyond the orbit of Neptune and extends to a few hundred Astronomical Units, or AU, with one AU representing the distance between Earth and the sun.
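For concreteness, the AU-to-kilometre conversion is a one-liner; this short sketch assumes the IAU 2012 value of the astronomical unit:

```python
AU_KM = 149_597_870.7  # kilometres per astronomical unit (IAU 2012 definition)

def au_to_km(au):
    """Convert a distance in astronomical units to kilometres."""
    return au * AU_KM

# 60 AU, roughly where the warp in the distant KBO orbits is measured
print(f"{au_to_km(60):,.0f} km")  # 8,975,872,242 km
```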
Like its inner solar system cousin, the asteroid belt between Mars and Jupiter, the Kuiper Belt hosts a vast number of minor planets, mostly small icy bodies (the precursors of comets), and a few dwarf planets.
For the study, Volk and Malhotra analysed the tilt angles of the orbital planes of more than 600 objects in the Kuiper Belt in order to determine the common direction about which these orbital planes all precess. Precession refers to the slow change or "wobble" in the orientation of a rotating object.
KBOs operate in an analogous way to spinning tops, explains Malhotra, who is a Louise Foucar Marshall Science Research Professor and Regents' Professor of Planetary Sciences at LPL.
"Imagine you have lots and lots of fast-spinning tops, and you give each one a slight nudge," she says. "If you then take a snapshot of them, you will find that their spin axes will be at different orientations, but on average, they will be pointing to the local gravitational field of Earth.
"We expect each of the KBOs' orbital tilt angle to be at a different orientation, but on average, they will be pointing perpendicular to the plane determined by the sun and the big planets."
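The averaging Malhotra describes can be sketched in a few lines. This is only a toy illustration of the idea, not the authors' actual statistical method: represent each orbit by the unit vector normal to its plane, average the normals, and read off the tilt of the mean plane.

```python
import math

def orbit_normal(inc_deg, node_deg):
    """Unit vector normal to an orbit plane, given the inclination
    and longitude of ascending node in degrees."""
    i, om = math.radians(inc_deg), math.radians(node_deg)
    return (math.sin(i) * math.sin(om),
            -math.sin(i) * math.cos(om),
            math.cos(i))

def mean_plane_tilt(orbits):
    """Tilt (degrees) of the average orbital plane, relative to the
    reference plane, for a list of (inclination, node) pairs."""
    sx = sy = sz = 0.0
    for inc, node in orbits:
        nx, ny, nz = orbit_normal(inc, node)
        sx, sy, sz = sx + nx, sy + ny, sz + nz
    return math.degrees(math.acos(sz / math.hypot(sx, sy, sz)))

# Nodes spread evenly: the individual tilts average out to the reference plane
print(round(mean_plane_tilt([(5, 0), (5, 90), (5, 180), (5, 270)]), 3))  # -> 0.0
# Nodes clustered: the mean plane itself comes out tilted
print(round(mean_plane_tilt([(8, 40), (8, 50)]), 1))  # -> 8.0
```

A warp like the one Volk and Malhotra report corresponds to the second case: the distant KBOs' nodes are not spread evenly, so their mean plane tilts away from the invariable plane.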
If one were to think of the average orbital plane of objects in the outer solar system as a sheet, it should be quite flat past 50 AU, according to Volk.
"But going further out from 50 to 80 AU, we found that the average plane actually warps away from the invariable plane," she explains. "There is a range of uncertainties for the measured warp, but there is no more than a 1 or 2 percent chance that this warp is merely a statistical fluke of the limited observational sample of KBOs." In other words, the effect is most likely a real signal rather than a statistical fluke.
According to the calculations, an object with the mass of Mars orbiting roughly 60 AU (about 8.98 billion kilometres) from the sun on an orbit tilted by about eight degrees (to the average plane of the known planets) has sufficient gravitational influence to warp the orbital plane of the distant KBOs within about 10 AU to either side.
"The observed distant KBOs are concentrated in a ring about 30 AU wide and would feel the gravity of such a planetary mass object over time," Volk said, "so hypothesizing one planetary mass to cause the observed warp is not unreasonable across that distance."
This rules out the possibility that the postulated object in this case could be the hypothetical Planet Nine, whose existence has been suggested based on other observations. That planet is predicted to be much more massive (about 10 Earth masses) and much farther out at 500 to 700 AU.
"That is too far away to influence these KBOs," Volk said. "It certainly has to be much closer than 100 AU to substantially affect the KBOs in that range."
Because a planet, by definition, has to have cleared its orbit of minor planets such as KBOs, the authors refer to the hypothetical mass as a planetary mass object. The data also do not rule out the possibility that the warp could result from more than one planetary mass object.
So why haven't we found it yet? Most likely, according to Malhotra and Volk, because we haven't yet searched the entire sky for distant solar system objects. The most likely place a planetary mass object could be hiding would be in the galactic plane, an area so densely packed with stars that solar system surveys tend to avoid it.
"The chance that we have not found such an object of the right brightness and distance simply because of the limitations of the surveys is estimated to be about 30 percent," Volk said.
A possible alternative to an unseen object that could have ruffled the plane of outer Kuiper Belt objects could be a star that buzzed the solar system in recent (by astronomical standards) history, the authors said.
"A passing star would draw all the 'spinning tops' in one direction," Malhotra said. "Once the star is gone, all the KBOs will go back to precessing around their previous plane. That would have required an extremely close passage at about 100 AU, and the warp would be erased within 10 million years, so we don't consider this a likely scenario."
Humankind's chance to catch a glimpse of the mysterious object might come fairly soon once construction of the Large Synoptic Survey Telescope is completed. Run by a consortium that includes the UA and scheduled for first light in 2020, the instrument will take unprecedented, real-time surveys of the sky, night after night.
"We expect LSST to bring the number of observed KBOs from currently about 2000 to 40,000," Malhotra said. "There are a lot more KBOs out there — we just have not seen them yet. Some of them are too far and dim even for LSST to spot, but because the telescope will cover the sky much more comprehensively than current surveys, it should be able to detect this object, if it's out there."
Friday, 7 July 2017
The idea is known as entanglement or, as Einstein scathingly described it, “spooky action at a distance”. This is when particles in a quantum system influence each other, even over vast distances. This phenomenon has been confirmed by several experiments that test Bell's inequality, but the new research argues something different might be at play.
They call it “retrocausality”. The set-up of an experiment will influence the particles in the past, so what appears to be an action at a distance is, in reality, something the observer made happen by selecting a certain experiment. Two physicists have now put some math behind this hypothesis. Their work is published in Proceedings of The Royal Society A.
"There is a small group of physicists and philosophers that think this idea is worth pursuing, including Huw Price and Ken Wharton [a physics professor at San José State University]," lead author Matthew Leifer, from Chapman University, told Phys.org. "There is not, to my knowledge, a generally agreed upon interpretation of quantum theory that recovers the whole theory and exploits this idea. It is more of an idea for an interpretation at the moment, so I think that other physicists are rightly sceptical, and the onus is on us to flesh out the idea."
This is a good approach to have. Just because the idea is very out there doesn’t mean we won’t learn something by exploring it. Since it challenges some central tenets of quantum physics, the proposed solutions might hint at some unexpected physics even if it's not related to retrocausality.
The model assumes that quantum theory is perfectly symmetric in time, so that the laws of quantum mechanics look the same whether you run them forward or backward. The standard analogy is a film of an egg rolling off a kitchen counter: if you see the egg drop and break, you know you are watching the film forward, because we never see broken eggs reassemble themselves.
But in the quantum world, we don’t have an arrow of time, and if we assume that things must be symmetric, then retrocausality naturally arises, the researchers argue. That allows you to get rid of entanglement and a lot of constraints placed by the Bell test.
While this is early days, it will be interesting to see if this idea can be developed in ways that can be tested against the current model.
On the scale of galaxies, gravity appears to be stronger than we can account for using only particles that are able to emit light. So we add dark matter particles as 25% of the mass-energy of the Universe. Such particles have never been directly detected.
On the larger scales on which the Universe is expanding, gravity appears weaker than expected in a universe containing only particles – whether ordinary or dark matter. So we add “dark energy”: a weak anti-gravity force that acts independently of matter.
Brief history of “dark energy”
The idea of dark energy is as old as general relativity itself. Albert Einstein included it when he first applied relativity to cosmology exactly 100 years ago.
Einstein mistakenly wanted to exactly balance the self-attraction of matter by anti-gravity on the largest scales. He could not imagine that the Universe had a beginning and did not want it to change in time.
Almost nothing was known about the Universe in 1917. The very idea that galaxies were objects at vast distances was debated.
Einstein faced a dilemma. The physical essence of his theory, as summarised decades later in the introduction of a famous textbook is:
Matter tells space how to curve, and space tells matter how to move.
That means space naturally wants to expand or contract, bending together with the matter. It never stands still.
This was realised by Alexander Friedmann, who in 1922 kept the same ingredients as Einstein but did not try to balance the amounts of matter and dark energy. That led to models in which universes could expand or contract.
Further, the expansion would always slow down if only matter was present. But it could speed up if anti-gravitating dark energy was included.
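In the standard Friedmann picture, this tug-of-war is captured by the present-day deceleration parameter. A minimal sketch, assuming a spatially flat universe and the textbook relation q0 = Omega_m/2 - Omega_Lambda:

```python
def deceleration_parameter(omega_m, omega_lambda):
    """Present-day deceleration parameter for a flat FLRW universe:
    q0 = omega_m / 2 - omega_lambda.
    q0 > 0: the expansion slows down; q0 < 0: it speeds up."""
    return omega_m / 2.0 - omega_lambda

print(round(deceleration_parameter(1.0, 0.0), 2))  # matter only -> 0.5 (decelerating)
print(round(deceleration_parameter(0.3, 0.7), 2))  # 70% dark energy -> -0.55 (accelerating)
```

The second line is the standard-cosmology budget described in the next paragraph: with 70% dark energy the sign flips and the expansion accelerates.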
Since the late 1990s many independent observations have seemed to demand such accelerating expansion, in a Universe with 70% dark energy. But this conclusion is based on the old model of expansion that has not changed since the 1920s.
Standard cosmological model
Einstein’s equations are fiendishly difficult. And not simply because there are more of them than in Isaac Newton’s theory of gravity.
Unfortunately, Einstein left some basic questions unanswered. These include – on what scales does matter tell space how to curve? What is the largest object that moves as an individual particle in response? And what is the correct picture on other scales?
These issues are conveniently avoided by the 100-year-old approximation — introduced by Einstein and Friedmann — that, on average, the Universe expands uniformly. Just as if all cosmic structures could be put through a blender to make a featureless soup.
This homogenising approximation was justified early in cosmic history. We know from the cosmic microwave background — the relic radiation of the Big Bang — that variations in matter density were tiny when the Universe was less than a million years old.
But the universe is not homogeneous today. Gravitational instability led to the growth of stars, galaxies, clusters of galaxies, and eventually a vast “cosmic web”, dominated in volume by voids surrounded by sheets of galaxies and threaded by wispy filaments.
In standard cosmology, we assume a background expanding as if there were no cosmic structures. We then do computer simulations using only Newton’s 330-year-old theory. This produces a structure resembling the observed cosmic web in a reasonably compelling fashion. But it requires including dark energy and dark matter as ingredients.
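As a toy illustration of what simulating with only Newton's theory involves, here is a minimal direct-summation two-body integrator using a kick-drift (semi-implicit Euler) step. Real cosmological codes evolve millions of particles in an expanding background with far more careful integrators:

```python
import math

G = 6.674e-11  # Newton's gravitational constant, m^3 kg^-1 s^-2

def step(pos, vel, masses, dt):
    """Advance a direct-summation Newtonian N-body system by one
    kick-drift (semi-implicit Euler) step. pos/vel are lists of [x, y]
    in metres and m/s; masses are in kg; dt is in seconds."""
    n = len(pos)
    for i in range(n):
        ax = ay = 0.0
        for j in range(n):
            if i == j:
                continue
            dx, dy = pos[j][0] - pos[i][0], pos[j][1] - pos[i][1]
            r = math.hypot(dx, dy)
            a = G * masses[j] / (r * r)
            ax += a * dx / r
            ay += a * dy / r
        vel[i][0] += ax * dt  # kick: update velocity from gravity
        vel[i][1] += ay * dt
    for i in range(n):
        pos[i][0] += vel[i][0] * dt  # drift: move with the new velocity
        pos[i][1] += vel[i][1] * dt
    return pos, vel

# Sun + Earth: integrate one year in one-day steps
pos = [[0.0, 0.0], [1.496e11, 0.0]]
vel = [[0.0, 0.0], [0.0, 29780.0]]
masses = [1.989e30, 5.972e24]
for _ in range(365):
    pos, vel = step(pos, vel, masses, 86400.0)
r = math.hypot(pos[1][0] - pos[0][0], pos[1][1] - pos[0][1])
print(f"{r / 1.496e11:.2f} AU")  # Earth stays near 1 AU after a full orbit
```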
Even after inventing 95% of the energy density of the universe to make things work, the model itself still faces problems that range from tensions to anomalies.
Further, standard cosmology also fixes the curvature of space to be uniform everywhere, and decoupled from matter. But that’s at odds with Einstein’s basic idea that matter tells space how to curve.
We are not using all of general relativity! The standard model is better summarised as: Friedmann tells space how to curve, and Newton tells matter how to move.
Since the early 2000s, some cosmologists have been exploring the idea that while Einstein’s equations link matter and curvature on small scales, their large-scale average might give rise to back reaction – average expansion that’s not exactly homogeneous.
Matter and curvature distributions start out near uniform when the universe is young. But as the cosmic web emerges and becomes more complex, the variations of small-scale curvature grow large and average expansion can differ from that of standard cosmology.
Recent numerical results of a team in Budapest and Hawaii that claim to dispense with dark energy used standard Newtonian simulations. But they evolved their code forward in time by a non-standard method to model the back reaction effect.
Intriguingly, the resulting expansion law fit to Planck satellite data tracks very close to that of a ten-year-old general relativity-based back reaction model, known as the timescape cosmology. It posits that we have to calibrate clocks and rulers differently when considering variations of curvature between galaxies and voids. For one thing, this means that the Universe no longer has a single age.
In the next decade, experiments such as the Euclid satellite and the CODEX experiment, will have the power to test whether cosmic expansion follows the homogeneous law of Friedmann, or an alternative back reaction model.
To be prepared, it’s important that we don’t put all our eggs in one cosmological basket, as Avi Loeb, Chair of Astronomy at Harvard, has recently warned. In Loeb’s words:
To avoid stagnation and nurture a vibrant scientific culture, a research frontier should always maintain at least two ways of interpreting data so that new experiments will aim to select the correct one. A healthy dialogue between different points of view should be fostered through conferences that discuss conceptual issues and not just experimental results and phenomenology, as often is the case currently.
What can general relativity teach us?
While most researchers accept that the back reaction effects exist, the real debate is about whether this can lead to more than a 1% or 2% difference from the mass-energy budget of standard cosmology.
Any back reaction solution that eliminates dark energy must explain why the law of average expansion appears so uniform despite the inhomogeneity of the cosmic web, something standard cosmology assumes without explanation.
Since Einstein’s equations can in principle make space expand in extremely complicated ways, some simplifying principle is required for their large-scale average. This is the approach of the timescape cosmology.
Any simplifying principle for cosmological averages is likely to have its origins in the very early Universe, given it was much simpler than the Universe today. For the past 38 years, inflationary universe models have been invoked to explain the simplicity of the early Universe.
While successful in some aspects, many models of inflation are now ruled out by Planck satellite data. Those that survive give tantalising hints of deeper physical principles.
Many physicists still view the Universe as a fixed continuum that comes into existence independently of the matter fields that live in it. But, in the spirit of relativity – that space and time only have meaning when they are relational – we may need to rethink basic ideas.
Since time itself is only measured by particles with a non-zero rest mass, maybe space-time as we know it only emerges as the first massive particles condense.
Whatever the final theory, it will likely embody the key innovation of general relativity, namely the dynamical coupling of matter and geometry, at the quantum level.
Thursday, 6 July 2017
The origin of the Universe occurred 13.8 billion years ago in an event astronomers call the Big Bang. Our universe then sprang into existence through a phase of rapid inflation. It then took 9.3 billion years for the solar system to form, and further billions of years for life and humanity to evolve to the point they are at today. It has only been in the last couple of decades that we have been able to search for and discover extrasolar planets around nearby stars in our own Milky Way galaxy. As of 1 July 2017, there have been 3,621 exoplanets confirmed, in 2,712 planetary systems and 611 multiple planetary systems. This seems a large number, but in terms of the size of our expanding universe it is minuscule. There are a lot of parameters in play just for life to have a chance of arising and evolving. Anyone who has read Charles Darwin’s On the Origin of Species will know that the steps that follow are delicate, and there are many branches of different kinds of life before one is successful and leads to humanity. Our own solar system formed 4.5 billion years ago, and life first took hold on the Earth 3.8 billion years ago. It has taken this long for humanity to become the dominant species with a reasonable level of intelligence.
The first galaxies came into existence 200 million years after the Big Bang; however, this does not mean that conditions were favourable for life to begin on a planet. Our own sun is a third-generation star, which means that two stars existed in its place before it, each of which cooked chemical elements in its interior and exploded as a supernova, spreading its contents around as a nebula. It was out of this proto-planetary material that our sun and solar system evolved. While heavier elements such as iron now exist throughout the universe, the first stars contained only hydrogen and helium (with a trace of lithium), which was not sufficient for life to start; first- and second-generation stars were needed beforehand to cook the heavier elements in their cores.
The question I get asked rather a lot is: ‘Do you think there is intelligent life in the universe?’
There were not sufficient elements around after the first stars died for life to form. Second-generation stars were needed to cook the elements further, so my answer is that there has not been enough time for super-intelligent beings to arise in the universe today.
I said earlier that the laws of physics are not the same everywhere in the universe, so it is possible that life could have formed earlier on the first planets orbiting distant stars. Here again the parameters have to be just right, with the planet orbiting its parent star inside the habitable zone, and the star being similar to our sun and stable in nature. If the spark of life did happen on a planet around a second-generation star, it’s my opinion that the inhabitants would not have evolved more than halfway towards a Type I civilisation on the Kardashev Scale.
The Kardashev Scale was originally proposed in 1964 by the Russian astrophysicist Nikolai Kardashev (born 1932 in Moscow). It has three base classes, each with an energy disposal level: Type I (10¹⁶ W), Type II (10²⁶ W), and Type III (10³⁶ W).
The human race is not on this scale yet. Since we still sustain our energy needs from dead plants and animals here on Earth, we are a lowly Type 0 civilization, and we have about 100 to 200 years to go before being promoted to a Type I civilization.
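Carl Sagan later proposed a continuous interpolation of the scale (an extension, not part of Kardashev's original three classes), which puts present-day humanity at roughly Type 0.7:

```python
import math

def kardashev_rating(power_watts):
    """Sagan's continuous Kardashev rating: K = (log10(P) - 6) / 10,
    with P the civilisation's energy use in watts."""
    return (math.log10(power_watts) - 6) / 10

print(round(kardashev_rating(2e13), 2))  # humanity today, ~2x10^13 W -> 0.73
print(round(kardashev_rating(1e16), 2))  # Type I threshold (10^16 W) -> 1.0
```

On this formula the thresholds in the paragraph above fall exactly at K = 1, 2 and 3.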
A Type I designation is given to a species that has been able to harness all the energy that reaches its planet from the neighbouring star, gathering and storing it to meet the energy demands of a growing population. This means that we would need to boost our current energy production over 100,000 times to reach this status. Being able to harness all Earth's energy would also mean that we could have control over all natural forces. Human beings could control volcanoes, the weather, and even earthquakes! These kinds of feats are hard to believe, but compared to the advances that may still be to come, these are just basic and primitive levels of control.
A Type II civilization can harness the power of its entire star (not merely transforming starlight into energy, but controlling the star). Several methods for this have been proposed, the most popular of which is the hypothetical ‘Dyson Sphere.’ This device would encompass every single inch of the star, gathering most (if not all) of its energy output and transferring it to a planet for later use. Alternatively, if fusion power had been mastered by the race, a reactor on a truly immense scale could be used to satisfy its needs. Nearby gas giants could be utilized for their hydrogen, slowly drained by an orbiting reactor.
What would this much energy mean for a species? Well, nothing known to science could wipe out a Type II civilization. If humans survived long enough to reach this status and a moon-sized object entered our solar system on a collision course with our little blue planet, we’d have the ability to vaporize it out of existence. Or, if we had time, we could move our planet out of the way, completely dodging it. But let’s say we didn’t want to move Earth: are there any other options? Yes, because we’d have the capability to move Jupiter, or another planet of our choice, into the way.
A Type III civilisation is one in which a species has become a race of galactic travellers with command of everything having to do with energy, resulting in it becoming a master race. In terms of humans, hundreds of thousands of years of evolution, both biological and mechanical, may result in the inhabitants of such a Type III civilization being incredibly different from the human race as we know it. They may be cyborgs (cybernetic organisms, beings both biological and robotic), with the descendants of regular humans being a sub-species among the now highly advanced society. These wholly biological humans would likely be seen as disabled, inferior, or un-evolved by their cybernetic counterparts.