
Showing posts with label News.

Saturday, September 13, 2025

'New' island emerges in Alaska

NASA satellite imagery shows a 'new' island emerging from melting ice in Alaska

Global warming is having effects all around the world. In some places, it’s transforming once-lush environments into barren deserts; in others, it’s doing essentially the opposite, albeit through an irony-fuelled, monkey’s paw kind of interpretation of “re-green the Earth’s deserts”. NASA's Earth Observatory has announced that Alaska has a "brand new island": satellite images reveal that the retreating Alsek Glacier has lost contact with the Prow Knob landmass, leaving the small mountain surrounded by the waters of Alsek Lake in the state's southeast. Alaska, scientists say, is turning into "a new lake district" thanks to its melting glaciers.

A new set of photos from NASA has laid bare a striking example of the latter effect, and it’s right on our doorstep. Alaska, it turns out, has a new island, not thanks to new land rising up through the seas, but because of glacial melt so dramatic that it has surrounded a piece of land previously connected to the mainland. “Along the coastal plain of southeastern Alaska, water is rapidly replacing ice,” NASA announced in an Earth Observatory Image of the Day post. “Glaciers in this area are thinning and retreating, with meltwater forming proglacial lakes off their fronts. In one of these growing watery expanses, a new island has emerged.” The landmass, named Prow Knob, is a small mountain which was formerly surrounded by the Alsek Glacier in Glacier Bay National Park. However, Alsek Glacier has been retreating for decades, slowly separating itself from Prow Knob and leaving a growing freshwater lake in its wake.

“The Alsek Glacier once encircled a small mountain known as Prow Knob near its terminus,” NASA explained. “In summer 2025, the glacier lost contact with Prow Knob, leaving the approximately 2-square-mile (5-square-kilometer) landmass surrounded by the water of Alsek Lake.” A recent satellite image, taken by Landsat 9 in August, confirms that the glacier has now lost all connection to Prow Knob, according to the statement, written by Lindsey Doermann, a science writer at the NASA Earth Observatory. Prow Knob provides a clear visual example of how glaciers are thinning and retreating in southeastern Alaska. Alsek Glacier used to split into two channels to wind its way around the mountain; in the early 20th century, the ice extended across what is now Alsek Lake as far as Gateway Knob, about 3 miles (5 kilometers) west of Prow Knob.

The melting of the Alsek Glacier happened slowly, until it didn’t. In 1894, the earliest observations of the glacier on record, the ice covered essentially all of what’s now Alsek Lake; reports from 1907 described it as being “anchored to a nunatak”, a rocky island surrounded by flowing glacier ice rather than water, with an ice face “as much as 50 m[eters, 164 feet] high,” according to a US Geological Survey review from 2005. The late glaciologist Austin Post, who captured aerial photographs of Alsek in 1960, named Prow Knob after its resemblance to the prow (pointed front end) of a ship. Post and fellow glaciologist Mauri Pelto, a professor of environmental science at Nichols College in Massachusetts, had predicted that Alsek Glacier would release Prow Knob in 2020, based on its rate of retreat between 1960 and 1990. The glacier therefore clung on to its mountain for slightly longer than predicted.

But “by 1948, the glacier had retreated 1.5 to 2.5 km,” the survey continues. “By 25 August 1960, retreat was as much as 5 km.” By the late 1970s, the glacier, now much reduced in size and reach, seemed about to separate into two ice tongues; two decades later, it had done precisely that, and “by 2003, the terminus separated into three distinct ice tongues,” the authors write. Prow Knob completely separated from Alsek Glacier between July 13 and August 6, 2025. Many of Earth's glaciers are retreating as the planet warms due to climate change. Last year was the hottest year for global average temperatures since records began, while 2025 has been marked by a string of record-breaking and near-record-breaking hot months. All of these changes can be seen in the images published by NASA, but it’s the most recent pair, from 2018 and 2025, that is most striking. In the space of just seven years, a short enough time that you probably think of yourself as not having aged at all, Alsek Lake has grown from abutting most of Prow Knob to entirely surrounding it.

At the same time, the ice has retreated, melting into the lake and calving away from itself as soaring temperatures make it warmer and less stable. It’s a trend that’s likely to continue, NASA warns, and one that’s becoming all too familiar in the once-icy Arctic state. Along with the Yakutat and Grand Plateau Glaciers, the melting of Alsek has produced lakes almost double their former size, not in times long past, but within living memory. “The lakes that are forming in this region are immense,” Pelto told NASA in November last year. “Starting at the mountains and spreading toward the coast,” the waters left by these melting glaciers likely represent the fastest lake growth in the US this century, he believes. Alaska is now “a new lake district,” he said. “[One] that is unique in our nation.”

Wednesday, September 10, 2025

The Atlantic Ocean is turning brown

Something huge and brown is taking over the Atlantic Ocean: record 37.5 million tons of toxic seaweed suffocates Caribbean beaches

Since 2011, a monstrous structure has taken shape in the Atlantic Ocean almost every year, sprawling from the West African coast to the Gulf of Mexico. It’s the Great Atlantic Sargassum Belt, a gargantuan bloom of a brown free-floating seaweed. In May, the seaweed belt hit a record biomass of 37.5 million tons. In a study, researchers from Florida Atlantic University’s (FAU) Harbor Branch Oceanographic Institute outline the rapidly growing seaweed’s development over the last four decades. Unsurprisingly, human activity is involved in this widespread ecological change. A vast and perplexing brown tide is sweeping across the Atlantic Ocean, alarming scientists as it disrupts ecosystems and threatens coastal communities from Africa to the Americas. Key points include the following:

 A massive brown tide known as the Great Atlantic Sargassum Belt is spreading across the Atlantic Ocean.

The phenomenon highlights the interconnectedness of global ecosystems and the impact of human actions.

Coastal communities face economic and health risks as the seaweed clogs beaches and releases harmful gases.

Scientists link the growth to human activities which introduce excessive nutrients into the ocean.

A peculiar ecological phenomenon is sweeping across the Atlantic Ocean, drawing the attention of scientists and policymakers alike. This vast expanse of floating seaweed, known as the Great Atlantic Sargassum Belt, is not merely a natural curiosity but a potent indicator of the profound ways human activities are reshaping marine environments. The bloom, which now stretches from West Africa to the Gulf of Mexico, has reached unprecedented levels, posing significant challenges to coastal communities and ecosystems. As researchers strive to understand this phenomenon, the implications for our oceans, and for the people who rely on them, are becoming increasingly urgent. The influx of nutrients from major rivers, including the Mississippi and the Amazon, acts as a catalyst for this growth; researchers have identified these rivers as key drivers of the bloom’s expansion, supplying the nutrients which allow sargassum to thrive. The scale of the bloom is unprecedented, with biomass reaching the record 37.5 million tons in May. This massive accumulation of seaweed is reshaping entire ocean basins and challenging our understanding of marine ecosystems.

The Great Atlantic Sargassum Belt has been expanding dramatically, transforming from a localized phenomenon into a massive oceanic bloom. This belt of floating sargassum has spread from its traditional habitat in the Sargasso Sea to encompass a vast swath of the Atlantic Ocean. Ocean currents like the Loop Current and the Gulf Stream play a crucial role in this expansion, distributing nutrient-rich waters which fuel the seaweed’s growth. Satellite imagery has captured the rapid increase in sargassum biomass, doubling in just days under optimal conditions. “The expansion of sargassum isn’t just an ecological curiosity, it has real impacts on coastal communities. The massive blooms can clog beaches, affect fisheries and tourism, and pose health risks,” Brian Lapointe, lead author of the study and a marine scientist at FAU Harbor Branch, said. “Understanding why sargassum is growing so much is crucial for managing these impacts,” he added. “Our review helps to connect the dots between land-based nutrient pollution, ocean circulation, and the unprecedented expansion of sargassum across an entire ocean basin.”

The surge in sargassum biomass can be traced back to human activities which introduce excessive nutrients into the ocean. According to Brian Lapointe, land-based nutrient inputs are the primary drivers of this growth. Agricultural runoff, wastewater discharge and atmospheric deposition contribute to the nutrient-rich conditions which favour sargassum blooms. The chemical composition of sargassum has changed over the years, with nitrogen levels increasing significantly while phosphorus has declined. This shift indicates the profound impact of terrestrial processes on marine ecosystems. By altering the nutrient balance in the ocean, human activities are reshaping the growth patterns of marine species, with sargassum being a prime example. The seaweed’s ability to thrive in nutrient-poor waters by recycling marine waste further complicates management efforts and underscores the interconnectedness of terrestrial and marine ecosystems. Scientists previously believed that sargassum was mostly limited to the Sargasso Sea’s nutrient-poor waters. More recent research, however, has revealed the organism to be quite the traveller, tracing sargassum’s movement from nutrient-rich coastal areas, such as the western Gulf of Mexico, to the open ocean, hitching a ride on the Loop Current (one of the fastest currents in the Atlantic) and the Gulf Stream. In the open ocean, nutrients are usually concentrated at great depth.

The spread of the Great Atlantic Sargassum Belt has far-reaching consequences for coastal communities. The dense mats of seaweed can clog beaches, disrupt fisheries and pose health risks to local populations. Popular tourist destinations in the Caribbean, Mexico and Florida have experienced significant economic losses due to emergency clean-ups and decreased tourism revenue. Sargassum blooms also create oxygen-depleted zones beneath the dense mats, affecting marine life and fisheries. The decomposing seaweed releases hydrogen sulfide gas, which can cause respiratory problems for nearby residents. In extreme cases, such as the 1991 shutdown of a Florida nuclear power plant, the impacts of these blooms have disrupted critical infrastructure. As the belt continues to expand, these disruptions are likely to become more frequent, posing ongoing challenges to coastal economies and public health. In 2004 and 2005, satellite imagery revealed massive sargassum windrows, long bands of floating seaweed, in the western Gulf of Mexico, a region where rivers, including the Mississippi and Atchafalaya, are increasingly dumping nutrients.

Understanding the dynamics of sargassum growth requires an examination of its nutrient composition over time. Researchers have studied the changes in nitrogen, phosphorus and carbon levels across different regions of the Atlantic to identify the environmental forces driving this phenomenon. Factors such as river flows, rainfall and Amazon basin floods play a significant role in influencing the bloom’s biomass. In fact, research dating back to the 1980s has shown that the seaweed grows faster and is more productive in shallow, nutrient-rich waters than in nutrient-poor open ocean waters. In other words, more nutrients mean more sargassum. Under certain conditions, the biomass of Sargassum natans and Sargassum fluitans can double within a few days. By analysing the nutrient composition of sargassum, scientists are gaining insights into the complex interactions between terrestrial and marine ecosystems. The seaweed’s ability to adapt to varying nutrient levels and recycle marine waste highlights its resilience and complicates management strategies. As researchers continue to investigate the factors driving sargassum growth, the findings hold important implications for understanding how human activities influence marine environments on a global scale.
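The doubling behaviour described above is simple exponential growth, which is easy to sketch. In this illustration the two-day doubling time and the starting biomass are assumed example values, not figures from the FAU study:

```python
# Illustrative only: exponential growth with the doubling behaviour the
# article describes (sargassum biomass can double within a few days under
# favourable, nutrient-rich conditions). The 2-day doubling time and
# 1-ton starting biomass are assumed example values.

def biomass_after(days, initial_tons, doubling_days):
    """Biomass after `days`, growing exponentially with a fixed doubling time."""
    return initial_tons * 2 ** (days / doubling_days)

# Starting from 1 ton with an assumed 2-day doubling time:
print(biomass_after(0, 1.0, 2.0))   # 1.0
print(biomass_after(2, 1.0, 2.0))   # 2.0  (one doubling)
print(biomass_after(10, 1.0, 2.0))  # 32.0 (five doublings in 10 days)
```

The point of the sketch is the compounding: even a modest patch that keeps doubling every few days overwhelms a coastline within weeks.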

Phosphorus and nitrogen are crucial nutrients for sargassum. From the 1980s to the 2020s, while the seaweed’s nitrogen content rose by over 50%, its phosphorus declined. “These changes reflect a shift away from natural oceanic nutrient sources like upwelling and vertical mixing, and toward land-based inputs such as agricultural runoff, wastewater discharge and atmospheric deposition,” Lapointe explained. In other words, human activity. Carbon levels in sargassum are also creeping upwards, demonstrating how outside nutrients are changing its makeup and affecting ocean plant life, he added. The team highlights, however, that sargassum windrows can also grow in nutrient-poor waters by recycling nutrients from marine animal waste, among other methods. The Great Atlantic Sargassum Belt serves as a stark reminder of the interconnectedness of global ecosystems. The bloom’s expansion reflects how human activities, such as nutrient pollution from agriculture and urban development, can have far-reaching impacts on marine environments. As scientists work to unravel the complexities of this ecological phenomenon, the broader implications for ocean health and coastal communities remain a pressing concern.

“Our review takes a deep dive into the changing story of sargassum, how it’s growing, what’s fuelling that growth, and why we’re seeing such a dramatic increase in biomass across the North Atlantic,” Lapointe explained. “By examining shifts in its nutrient composition, particularly nitrogen, phosphorus and carbon, and how those elements vary over time and space, we’re beginning to understand the larger environmental forces at play.” The study is just one more example of how human activity is driving deeply rooted ecological changes, with the extent of their farthest-reaching consequences still terrifyingly unknown.

Tuesday, September 9, 2025

Secret fresh water under the ocean

 Scientists tap "secret fresh water" known to exist in shallow salt waters

Deep in Earth’s past, an icy landscape became a seascape as the ice melted and the oceans rose off what is now the northeastern US. Nearly 50 years ago, a US government ship searching for minerals and hydrocarbons in the area drilled into the seafloor to see what it could find. It found, of all things, drops to drink under the briny deeps: fresh water. This summer, a first-of-its-kind global research expedition followed up on that surprise. Drilling beneath the salt water off Cape Cod extracted thousands of samples from what is now thought to be a massive, hidden aquifer stretching from New Jersey as far north as Maine. It's just one of many depositories of "secret fresh water" known to exist in shallow salt waters around the world which might some day be tapped to slake the planet's intensifying thirst. "We need to look for every possibility we have to find more water for society," said Brandon Dugan, the expedition's co-chief scientist, a geophysicist and hydrologist at the Colorado School of Mines who recently spent 12 hours on the drilling platform. The research teams looked in "one of the last places you would probably look for fresh water on Earth." They found it, and will be analysing nearly 50,000 litres (13,209 gallons) of it in their labs around the world in the coming months. They're out to solve the mystery of its origins: whether the water came from glaciers, from groundwater systems connected to land, or some combination. The potential is enormous. So are the hurdles of getting the water out and puzzling over who owns it, who uses it and how to extract it without undue harm to nature. It's bound to take years to bring that water ashore for public use in a big way, if it's even feasible.

The work at sea unfolded over three months from Liftboat Robert, an oceangoing vessel that, once on site, lowers three enormous pillars to the seafloor and squats above the waves. Normally it services offshore petroleum sites and wind farms. But this drill mission was different. "It's known that this phenomena exists both here and elsewhere around the world," Expedition 501 project manager Jez Everest, a scientist from the British Geological Survey in Edinburgh, Scotland, said of undersea water. "But it's a subject that's never been directly investigated by any research project in the past." By that, he means no one globally had drilled systematically into the seabed on a mission to find fresh water. Expedition 501 was quite literally groundbreaking: it penetrated as much as 1,289 feet (nearly 400 meters) below the seafloor. But it followed a 2015 research project which mapped the contours of an aquifer remotely, using electromagnetic technology, and roughly estimated the salinity of the water underneath. That project, by the Woods Hole Oceanographic Institution and the Lamont-Doherty Earth Observatory at Columbia University, reported evidence of a "massive offshore aquifer system" in this area, possibly rivalling the size of America's largest, the Ogallala aquifer, which supplies water to parts of eight Great Plains states. Two developments in 1976 had stirred interest in searching for undersea fresh water. In the middle of Nantucket island, the US Geological Survey drilled a test well to see how far down the groundwater went. It extracted fresh water from such great depths that scientists wondered whether the water came from the sea, not the sky.

The federal agency mounted a 60-day expedition aboard the drilling vessel Glomar Conception along a vast stretch of the Continental Shelf from Georgia to Georges Bank off New England. It drilled cores in search of the sub-seabed's resources, like methane. It found an eye-opening amount of fresh or freshened water in borehole after borehole. That set the stage for the water-seekers to do their work a half-century later. Soon after Robert arrived at the first of three drilling sites, samples drawn from below the seabed registered salinity of just 4 parts per thousand. That's far below the oceans' average salt content of 35 parts per thousand, but still too briny to meet the US freshwater standard of under 1 part per thousand. "Four parts per thousand was a eureka moment," Dugan said, because the finding suggested that the water must have been connected to a terrestrial system in the past, or still is. As the weeks wore on and Robert moved from site to site 20 to 30 miles (30 to 50 km) off the coast, drilling into the waterlogged subsea sediment yielded samples down to 1 part per thousand salt content. Some were even lower.
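The salinity figures above sit on a simple scale, sketched below. The thresholds (35 parts per thousand for average seawater, under 1 for the US freshwater standard) come from the article; the category labels and the `classify_salinity` helper are our own illustration:

```python
# Thresholds from the article: ocean average ~35 parts per thousand (ppt),
# US freshwater standard under 1 ppt. The category names are our own
# labels for illustration, not an official classification.

OCEAN_AVERAGE_PPT = 35.0
US_FRESHWATER_LIMIT_PPT = 1.0

def classify_salinity(ppt):
    """Place a salinity reading (parts per thousand) on the article's scale."""
    if ppt < US_FRESHWATER_LIMIT_PPT:
        return "fresh (meets US standard)"
    if ppt < OCEAN_AVERAGE_PPT:
        return "brackish / freshened"
    return "seawater"

print(classify_salinity(4.0))   # the 'eureka' sample: brackish / freshened
print(classify_salinity(0.8))   # below 1 ppt: fresh (meets US standard)
print(classify_salinity(35.0))  # typical open ocean: seawater
```

The 4 ppt reading mattered precisely because it falls so far below the seawater baseline while still missing the freshwater cutoff, hinting at a terrestrial connection.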

In months of analysis ahead, the scientists will investigate a range of properties of the water, including what microbes were living in the depths, what they used for nutrients and energy sources and what byproducts they might generate; in other words, whether the water is safe to consume or otherwise use. "This is a new environment that has never been studied before," said Jocelyne DiRuggiero, a Johns Hopkins University biologist in Baltimore who studies the microbial ecology of extreme environments and is not involved in the expedition. "The water may contain minerals detrimental to human health since it percolated through layers of sediments," she said. "However, a similar process forms the terrestrial aquifers that we use for freshwater, and those typically have very high quality." By sequencing DNA extracted from their samples, she said, the researchers can determine which microorganisms are there and "learn how they potentially make a living."

In just five years, the UN says, the global demand for fresh water will exceed supplies by 40%. Rising sea levels from the warming climate are souring coastal freshwater sources, while the data centres which power AI and cloud computing are consuming water at an insatiable rate. The fabled Ancient Mariner's lament, "Water, water, every where, nor any drop to drink," looms as a warning to landlubbers as well as to sailors on salty seas. Cape Town, South Africa, came perilously close to running out of fresh water for its nearly 5 million people in 2018 during an epic, three-year drought. South Africa is thought to have a coastal undersea freshwater bonanza, too, and there is at least anecdotal evidence that every continent may have the same. In Virginia alone, a quarter of all power produced in the state goes to data centres, a share expected to nearly double in five years. By some estimates, each midsize data centre consumes as much water as 1,000 households. Each of the Great Lakes states has experienced groundwater shortages. Canada's Prince Edward Island, Hawaii and Jakarta, Indonesia are among the places where stressed freshwater supplies coexist with prospective aquifers under the ocean. Expedition 501 is a $25 million scientific collaboration of more than a dozen countries, backed by the US government's National Science Foundation and the European Consortium for Ocean Research Drilling. Scientists went into the project believing the undersea aquifer they were sampling might be sufficient to meet the needs of a metropolis the size of New York City for 800 years. They found fresh or nearly fresh water at both higher and lower depths below the seafloor than they anticipated, suggesting an even larger supply.

Techniques will also be used to determine whether the water came from glacial ice melt thousands of years ago or is still arriving via labyrinthine geologic formations from land. Researchers will date the water back in the lab, which will be key in determining whether it is a renewable resource that could be used responsibly. Primordial water is trapped and finite; newer water suggests the aquifer is still connected to a terrestrial source and being refreshed, however slowly. "Younger means it was a raindrop 100 years ago, 200 years ago," Dugan said. "If young, it's recharging." Those questions are for basic science. For society, all sorts of complex questions arise if the basic science affirms the conditions necessary for exploiting the water. Who will manage it? Can it be taken without an unacceptable risk of contaminating the supply from the ocean above? Will it be cheaper or environmentally friendlier than today's energy-hungry desalination plants? Dugan said that if governments decide to tap the water, local communities could turn to the aquifers in times of need, such as drought, or when extreme storms flood coastal freshwater reserves and ruin them. The notion of actually using this old buried water is so new that it has not been on the radar of many policymakers or conservationists.

"It's a lesson in how long it can take sometimes to make these things happen and the perseverance that's needed to get there," said Woods Hole geophysicist Rob Evans, whose 2015 expedition helped point the way for 501. "There's a ton of excitement that finally they've got samples." Still, he sees some red flags. One is that tapping undersea aquifers could draw water away from onshore reserves. Another is that undersea groundwater which seeps out to the seafloor may supply nutrients vital to the ecosystem, and that might not be right. "If we were to go out and start pumping these waters, there would almost certainly be unforeseen consequences," he said. "There's a lot of balance we would need to consider before we started diving in and drilling and exploiting these kinds of things." For most in the project, getting to and from Liftboat Robert meant a voyage of seven hours or more from Fall River, Massachusetts, on a supply boat that made round trips every 10 days or so to replenish stocks and rotate people.

On the platform, around the clock, the racket of metal bore pipes and machinery, the drilling grime and the speckled mud mingled with the quieter, cleaner work of scientists in trailers converted to pristine labs and processing posts. There, samples were treated according to the varying needs of the expedition's geologists, geochemists, hydrologists, microbiologists, sedimentologists and more. Passing through clear plastic tubes, muck was sliced into disks like hockey pucks. Machines squeezed water out. Some samples were kept sealed to enable study of ancient gases dissolved in the water. Other samples were frozen, filtered or left as is, depending on the purpose. After six months of lab analysis, all the science teams of Expedition 501 will meet again, this time in Germany for a month of collaborative research that is expected to produce initial findings that point to the age and origin of the water.

Monday, September 8, 2025

Ocean lifeline vanishes due to climate disruption

Climate disruption interrupts Panama’s ocean lifeline for the first time in 40 years

Scientists from the Smithsonian Tropical Research Institute warned that the upwelling which makes the waters of the Gulf of Panama colder and richer in nutrients every year did not occur in 2025, for the first time in at least 40 years. Upwelling events in the Gulf of Panama primarily occur during Central America’s dry season (December to April), when the northern trade winds drive nutrient-rich waters to the surface, sustaining highly productive fisheries and protecting coral reefs from thermal stress. Scientists suspect weakened trade winds linked to climate disruption played a role, leaving cooler waters absent and fisheries under stress.

During the dry season in Central America (generally December through April), northern trade winds generate upwelling events in the ocean waters of the Gulf of Panama. Upwelling is a process that allows cold, nutrient-rich water from the depths of the ocean to rise to the surface. Thanks to this movement of water, the sea along Panama's Pacific beaches remains cooler during the "summer" vacation season, a dynamic that supports highly productive fisheries and helps protect coral reefs from thermal stress. STRI scientists have studied this phenomenon, and their records indicate that the seasonal upwelling had been a constant and predictable feature of the gulf for at least 40 years; in 2025, however, it “did not occur for the first time,” and the temperature decrease and increased productivity typical of this time of year were greatly reduced.

Scientists from the Smithsonian Tropical Research Institute (STRI) say the 2025 failure of this vital oceanographic process, which normally runs from January to April, is unprecedented in their records. They suggest a significant reduction in the trade winds was the cause, though further research is needed to determine a more precise mechanism and its potential consequences for fisheries. The situation reveals “how climate disruption can quickly alter fundamental oceanic processes that have sustained coastal fishing communities for thousands of years.”

This finding highlights the growing vulnerability of tropical upwelling systems, which, despite their enormous ecological and socioeconomic importance, remain poorly monitored. It also underscores the urgency of strengthening ocean-climate observation and prediction capabilities in the planet's tropical regions. This result marks one of the first major outcomes of the collaboration between the S/Y Eugen Seibold research vessel from the Max Planck Institute and STRI. The STRI, based in Panama, is a unit of the Smithsonian Institution which promotes understanding of tropical nature and its importance to human well-being. It also trains students to conduct tropical research and fosters conservation by raising public awareness of the beauty and importance of tropical ecosystems. 

Friday, September 5, 2025

New world record for internet speed by Japan

Japan’s latest breakthrough is rewriting the rules of speed: millions of times faster than typical speeds in the US

Japan has just crushed records with a new internet speed so fast, it’s almost hard to believe. Imagine streaming entire libraries, massive data collections or ultra-high-definition videos in mere minutes. This isn’t science fiction, it’s happening now thanks to a groundbreaking achievement from Japanese researchers. A team in Japan set a new world record in fibre optics, reaching a data speed of 1.02 petabits per second over roughly 1,123 miles (1,808 kilometers) with a new kind of optical fibre. The achievement yielded a capacity–distance product (data rate multiplied by distance) of about 1.86 exabits per second × kilometer. This rate is roughly 3.6 million times the US median fixed broadband download speed of about 285 Mbps. Lead researcher Hideaki Furukawa of the National Institute of Information and Communications Technology (NICT) in Japan guided the transmission experiments and system work. The team’s optical fibre system can transmit over a distance equivalent to traveling from New York to Florida. To put this into perspective, this speed would open doors to a future where data moves at incredible rates.
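The headline numbers can be sanity-checked with a little arithmetic. The figures below come from the article; the mile-to-kilometer conversion and the rounding are ours:

```python
# Sanity check on the reported figures. The data rate and distance are from
# the article; the km conversion (1,123 miles ~ 1,808 km) is our assumption.

data_rate_pbps = 1.02   # petabits per second
distance_km = 1808      # roughly 1,123 miles

# Capacity-distance product: data rate multiplied by distance,
# expressed in exabits per second x kilometer (1 exabit = 1000 petabits).
product_eb_km = data_rate_pbps * distance_km / 1000
print(round(product_eb_km, 2))  # ~1.84, matching the reported ~1.86
                                # once rounding of the inputs is accounted for

# How many times the US median fixed broadband download speed (285 Mbps)?
ratio = (data_rate_pbps * 1e15) / (285e6)
print(round(ratio / 1e6, 1))    # ~3.6 (million times faster)
```

The comparison point matters: against a lower "average" speed rather than the 285 Mbps median, the multiplier climbs toward the 4 million figure seen in some headlines.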

The team in Japan smashed the previous world record of just over 50,000 gigabytes per second, doubling it in a matter of months. This remarkable leap was made possible by a new form of optical fibre cable. Unlike conventional cables, this advanced fibre bundles 19 cores into a single strand barely thicker than a human hair, with a cladding about 0.005 inches (roughly 125 micrometers) in diameter, the same size used by most existing lines. This design allows it to slot into current routes without changing the outside diameter. The cores share a single glass cladding and are engineered to behave the same way, so light follows a uniform path through each core. This uniform behaviour reduces power swings and lowers loss in both the C band and L band, the primary wavelength ranges for long-distance links. The design also avoids the spacing penalties of uncoupled multicore layouts, where engineers minimize crosstalk by spacing cores farther apart. Less data loss means stronger signals and the ability to send information much farther without interruption. This optical fibre is specifically designed for long-distance transmission, making it a game-changer for telecommunications infrastructure.

Interestingly, the design fits into existing cable installations, since it matches the typical thickness of conventional single-fibre cables. Upgrades therefore won’t require costly, large-scale overhauls of the current network, a clever way to increase capacity while keeping costs and disruption low. In a coupled layout, the system allows mixing between cores and corrects it later using digital processing at the receiver. Low fibre loss across wide wavelength ranges, combined with predictable coupling, made long range and high rate possible at the same time. Earlier projects achieved fast signals over much shorter spans, but this approach pushes capacity and reach together. A petabit equals one million gigabits, a unit that marks a leap beyond the gigabit tier common to residential plans. The capacity–distance product multiplies data rate by distance to compare systems which go fast, far or both. Before this breakthrough, the same research team had achieved similar speeds, but only across a span less than one-third of the roughly 1,123 miles covered this time. The major obstacles were reducing data loss and boosting signal strength enough to maintain quality over longer distances. Their latest system loops the signal through the cable 21 times, ensuring it reaches the receiver after traveling over a thousand miles without significant degradation.
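The loop arithmetic above, plus a feel for what such a rate means for an everyday transfer, can be sketched in a few lines (the 1 TB backup is an illustrative figure, not from the article):

```python
# The recirculating-loop geometry described above, as simple arithmetic.
spool_miles = 53.5          # length of the fibre spool
laps = 21                   # times the signal circulates
total_miles = spool_miles * laps
print(f"{total_miles} miles end to end")

# At 1.02 Pb/s, how long would a hypothetical 1-terabyte backup take?
rate_bps = 1.02e15
seconds = 8e12 / rate_bps   # 1 TB = 8e12 bits
print(f"{seconds * 1000:.2f} ms")  # under a hundredth of a second
```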

A multicore fibre places several cores inside one cladding so that many signals travel in parallel. MIMO (multiple-input, multiple-output) processing is a digital filtering technique which separates mixed signals from different cores or modes, allowing the original data streams to emerge cleanly. Long-haul optical links use the C band and L band as their main wavelength windows because standard amplifiers operate efficiently in those ranges. The 16-state quadrature amplitude modulation (16QAM) method stores more information per symbol than simpler formats, raising data rates when noise and distortion are controlled. Looking back, it’s incredible how far we’ve come in such a short time. Remember the frustration of dial-up internet, when waiting several minutes just to open a single photo was normal? Now we’re talking about speeds which make those early experiences feel like ancient history. The team built 19 synchronized recirculating loops, each fed by one core of a 53.5-mile spool that included splitters, combiners, amplifiers and a control switch. A switch sent the signal around the loop 21 times before it reached a bank of receivers, producing the full end-to-end distance. They lit 180 wavelengths across the C and L bands and modulated each with 16QAM, a higher-order format which increases bits per symbol when conditions are clean enough. Multiple wavelengths across two bands gave the system a wide runway for total throughput. At the end, a coherent 19-channel receiver separated the spatial channels while a MIMO engine untangled the mixing introduced by the coupled cores. Error-correction coding finished the job and produced the net payload figure used to report the result.
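Working backwards from the totals in this paragraph gives a rough per-channel symbol rate. This is only a back-of-envelope sketch; dual-polarization transmission and zero coding overhead are assumptions for illustration, not details from the article:

```python
# Rough solve for the per-channel symbol rate implied by the totals above.
# Assumptions (not stated in the article): dual-polarization transmission
# and no forward-error-correction overhead.
total_bps = 1.02e15
cores = 19
wavelengths = 180
bits_per_symbol = 4      # 16QAM encodes 4 bits per symbol
polarizations = 2        # assumed, standard for coherent systems

channels = cores * wavelengths * polarizations
symbol_rate = total_bps / (channels * bits_per_symbol)
print(f"~{symbol_rate / 1e9:.1f} Gbaud per channel")
```

The answer, a few tens of gigabaud, is in the range modern coherent transceivers actually operate at, which is a useful cross-check that the headline numbers hang together.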

This progress is timely. With global data use expected to multiply rapidly in the coming years, the demand for new, scalable high-capacity communication systems is exploding. Japan’s advancement provides a promising roadmap to meet this demand, potentially transforming how governments, businesses and everyday users interact with data. So, what does this mean for you? Imagine streaming 8K videos or engaging in highly immersive virtual experiences without buffering or delays. Large-scale scientific research, cloud computing and even personal data backups could proceed almost instantly, reshaping what’s possible in almost every digital endeavour. Short bursts in a lab are one thing; dependable hauls between cities are another. Long spans expose loss, amplifier noise, nonlinear effects and chromatic dispersion which often remain hidden on short test beds. Engineers track progress in optical fibre systems with the capacity-distance product, which multiplies rate by distance to summarize both speed and reach in a single number. A higher product means a system can carry more bits for longer without running out of margin. This demonstration shows that dense spatial channels inside a standard-sized fibre, combined with broad wavelength use and shared amplification, can lift that product. It achieves this without changing the outside fibre size, a practical way to scale, since networks care about what fits in ducts, trays and connectors.

With data flowing from continent to continent at lightning-fast pace, the potential for innovation grows exponentially. Developers of the Internet of Things, augmented reality and smart cities will benefit immensely from stable, ultra-fast networks. This breakthrough isn’t just about raw speed; it’s a foundation for a more connected and intelligent world. A key choice was keeping the cladding diameter at about 0.005 inches (125 micrometers), which matches the size used by most installed fibre and the tools built around it. “For fibre fabrication and deployment, it is highly beneficial to use fibres with a standard cladding diameter,” said Menno van den Hout from the National Institute of Information and Communications Technology. Keeping dimensions and interfaces familiar lowers the barrier to field trials and later deployment if costs align. It also enables step-by-step rollouts, where multicore spans boost capacity on tough segments while other spans remain single-core. The idea of space-division multiplexing has been studied for more than a decade, and its value has been demonstrated across many experiments. “This Review summarizes the simultaneous transmission of several independent spatial channels of light along optical fibres to expand the data carrying capacity of optical communications,” said Benjamin Puttnam of the National Institute of Information and Communications Technology. This record from Japan illustrates the relentless human pursuit of pushing boundaries. Each technological leap sparks new opportunities and redefines the limits of what our devices and networks can do. It’s exciting to think about the possibilities this opens up, and a reminder that innovation never stops around the world.

Saturday, August 30, 2025

Nuclear battery with 50-year lifespan and triple efficiency

Chinese scientists have built a nuclear battery with a 50-year lifespan and triple efficiency

Researchers in China have developed a novel nuclear battery which can withstand at least half a century of radiation and deliver three times the energy efficiency of conventional designs. The team, led by Haisheng San, PhD, a professor at Xiamen University, and Xin Li, PhD, a researcher at the China Institute of Atomic Energy, set out to improve battery performance in extreme environments. The new radio-photovoltaic cells offer compact, long-term nuclear power, marking a significant leap forward in sustainable power solutions. Some of the important points:

Chinese researchers have developed a novel nuclear battery with threefold energy efficiency.

Challenges include the cost and production of strontium-90 radioisotopes.

The technology uses strontium-90 radio-photovoltaic cells for long-term, reliable power.

This advancement could significantly impact global energy policies and sustainability efforts.

In recent years, the quest for sustainable and efficient energy solutions has become increasingly urgent, and a team of Chinese researchers has made a groundbreaking advancement in this field by developing a novel nuclear battery. According to the scientists, conventional power systems, especially those used in extreme conditions such as space or deep-sea infrastructure, struggle with long-term reliability. “Conventional power sources (e.g., chemical batteries, fuel cells and photovoltaic cells) fail to meet the stringent operational demands of harsh environments, including long-term durability, maintenance-free operation and continuous self-sustaining capabilities,” the researchers said. Their limited energy density, sensitivity to environmental factors and need for periodic maintenance make them impractical for missions which require continuous, unattended power over many years. To address these challenges, the researchers developed strontium-90 radio-photovoltaic cells (RPVCs) built on a waveguide light concentration (WLC) structure. This technology promises to deliver three times the energy efficiency of existing designs while withstanding extreme environmental conditions for over half a century, which is particularly significant for applications in challenging environments, such as deep-sea exploration and space missions, where conventional power systems often struggle to perform reliably over extended periods.

The innovative design integrates multilayer-stacked GAGG:Ce (cerium-doped gadolinium aluminium gallium garnet) scintillation waveguides with strontium-90 radioisotopes. GAGG:Ce is a single-crystal scintillator known for its excellent photon detection capabilities; it is among the brightest available, with an emission peak at 520 nanometers (nm). The setup converts radioactive energy into light, which is then directed toward photovoltaic cells that generate electricity. In performance trials, a single RPVC unit achieved an energy conversion efficiency of 2.96%, significantly higher than existing RPVC designs. The team reported an output of 48.9 microwatts (μW) from a single unit, with a multi-module version reaching 3.17 milliwatts (mW). The prototype also demonstrated a short-circuit current of 2.23 milliamperes (mA) and an open-circuit voltage of 2.14 volts (V). “We designed and fabricated an RPVC that achieves a balance between efficiency and stability,” the scientists said. The efficiency achieved, combined with the ability to generate up to 3.17 milliwatts in a multi-module setup, represents a significant step forward in nuclear battery technology.
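For intuition, the 520 nm emission peak corresponds to green light of roughly 2.4 electron-volts per photon. This is a standard physics calculation rather than a figure from the study:

```python
# Energy of a 520 nm photon, the emission peak quoted for GAGG:Ce.
h = 6.62607e-34     # Planck constant, J*s
c = 2.99792e8       # speed of light, m/s
wavelength = 520e-9  # meters

energy_joules = h * c / wavelength
energy_ev = energy_joules / 1.602177e-19  # convert joules to electron-volts
print(f"{energy_ev:.2f} eV per photon")
```

Green photons in this energy range are comfortably above the bandgap of common photovoltaic cells, which is presumably part of why the scintillator pairs well with them.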

Most notably, when the team simulated long-term use by exposing the RPVCs to electron beam irradiation equivalent to 50 years of radiation exposure, the devices showed only a modest 13.8% drop in optical performance. This resilience makes them highly suitable for applications where continuous, unattended power is crucial. The WLC-based RPVCs not only achieve high power output but also maintain outstanding long-term stability: the system minimizes energy loss by directing light from the scintillator straight into the photovoltaic cells, and it requires no moving parts or external energy input. While challenges remain in the mass production and cost reduction of strontium-90 radioisotopes, the current research marks a substantial step forward in promoting nuclear battery applications.
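The 50-year figure is also consistent with the physics of the fuel itself. Strontium-90 has a half-life of about 28.8 years (a standard nuclear-data value, not a figure from the article), so a quick estimate shows how much of the isotope remains after the simulated service life:

```python
# Radioactive decay: fraction of Sr-90 remaining after 50 years of service.
half_life_years = 28.8   # Sr-90 half-life, standard nuclear data
years = 50

fraction_remaining = 0.5 ** (years / half_life_years)
print(f"{fraction_remaining:.0%} of the Sr-90 remains after {years} years")
```

Roughly a third of the fuel is still decaying, and therefore still generating light, at the end of the test window, which helps explain why the optical drop is modest rather than total.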

Summarizing the advantages of the discovery, the team elaborated that WLC-based RPVCs can achieve both high power output and outstanding long-term stability, representing a substantial advancement in facilitating nuclear battery applications. “The WLC structure realizes a 3-fold improvement in energy conversion efficiency compared with conventional RPVC structures,” the researchers explained. Despite the promising results, the large-scale production of RPVCs is currently limited by several challenges. The cost and availability of strontium-90 radioisotopes pose significant hurdles which need to be addressed for widespread adoption. Additionally, the researchers acknowledge the need for advancements in mass production techniques to make this technology economically viable. Nonetheless, the potential applications of this technology are vast. From powering deep-sea exploration equipment to sustaining space missions, the RPVCs offer a reliable and efficient energy source for environments where conventional power systems falter. The researchers’ work has laid the foundation for further developments in nuclear battery technology, with the promise of even greater efficiencies and wider applications in the future.

“The irradiation equivalent to 50 years of service confirms that WLC-based RPVCs have great long-term service stability,” the researchers added. The development of these advanced nuclear batteries also has significant global implications. As countries strive to meet increasing energy demands while reducing carbon footprints, innovations like the RPVCs offer a sustainable alternative. The ability to provide long-term, maintenance-free power solutions could revolutionize energy infrastructures, particularly in remote and challenging environments. Moreover, this advancement positions China as a leader in nuclear battery technology, potentially influencing global energy policies and research priorities. As the world grapples with energy challenges, collaborations and knowledge sharing across borders could accelerate the adoption and refinement of such technologies, benefiting the global community. “Although large-scale production of RPVCs is still limited by challenges such as mass production and cost reduction of strontium-90 radioisotopes, the current research results mark a substantial step forward in promoting nuclear battery applications,” the researchers concluded.

Thursday, August 28, 2025

World’s first electric flying car

World’s first electric flying car to start operations at Silicon Valley airports

The first all-electric flying car is here, and it could land at an airport near you. Flying cars are real; they are not just in the movies anymore. After a decade in the making, the world’s first all-electric flying car is about to take flight, having signed agreements with several airports for testing. Just a few years ago, not many thought this day would come. Alef Aeronautics has been developing its electric flying car since 2015, attracting major investors like Tim Draper, known for his early investment in Tesla. In 2022, Alef became an internet sensation after unveiling a new prototype, dubbed the Model A. The company claims the vehicle (or aircraft) can drive 200 miles and has a 110-mile flight range.

San Mateo-based Alef has signed agreements with the Hollister and Half Moon Bay airports to conduct operations of the world’s first flying car, a road vehicle which can take off vertically. The company will begin test operations alongside other aircraft types. Less than a year after the Model A’s unveiling, the California-based startup became the first to receive a Special Airworthiness Certification from the US Federal Aviation Administration. Alef took it a step further, becoming the first company to receive pre-orders for an aircraft sold through a car dealership. Alef also released a video earlier this year, giving potential customers a glimpse of the ‘Ultralight’ version of the Model A jumping over another vehicle.

Now, the company is set to begin test operations at the two Silicon Valley airports, Half Moon Bay and Hollister, to evaluate how the car works alongside other aircraft in air traffic. Both airports could also serve as a base for flying cars in the near future, according to the company. Alef will start with the Model Zero Ultralight and plans to expand its product line with other Model Zero variants and the commercial Model A. We got our first look at the all-electric flying car in action earlier this year, after Alef released a video of an “ultralight version” of the Model A jumping over another vehicle; the company claimed it was the “first-ever video in history of a car driving and vertically taking off”. CEO Jim Dukhovny introduced the Model A electric flying car at the Detroit Auto Show. The flying cars will operate both as a car and as an aircraft, alongside other types of aircraft, to assess their performance in common air traffic patterns. According to Alef’s website, the company has been working on the flying car for almost a decade, with the goal of developing its first consumer product, the Alef Model A.

Both airports could serve as a base for a future fleet of flying cars, according to Alef. It will start with the Model Zero Ultralight, but Alef plans to expand with other Model Zero models and the commercial Model A. Planned operations include driving, vertical takeoff, forward flight and vertical landing, as well as air and ground manoeuvring. The vehicle is classified as an ‘ultralight’, meaning it does not require standard airworthiness certification to fly, according to the company. Alef pointed out that the classification brings certain restrictions for operators, such as limiting flights to daylight hours and prohibiting ultralight vehicles from flying over congested or densely populated areas like cities or towns. Of course, this is not what anyone thought flying cars would be when dreaming of them decades ago: an unwieldy, overly expensive, hovering compromise on bicycle wheels, not quite a real car and not a true aircraft. Alef, for its part, says its flying car is “100% electric, drivable on public roads, and has vertical takeoff and landing capabilities.”

The flying car will be 100% electric, with a driving range of 200 miles and a flight range of 110 miles. “On average, the Alef flying car uses less energy per trip than a Tesla or any other EV,” the company said. “Alef first and foremost is a car, using the automotive infrastructure, automotive business model and automotive market. The novelty is integrating a car into the aviation infrastructure and air traffic,” said Jim Dukhovny, CEO of Alef. “Working in safe, controlled, non-towered airport environments will help Alef, FAA, airport operators, and pilots see how this will work in the future at scale. Electric aviation is more environmentally friendly, quieter and requires less space, hence it is good to see Silicon Valley airports embracing electric aviation,” he continued.

Alef’s flying electric car can jump over another vehicle. The company has already signed supply agreements for industry-grade parts with PUCARA Aero and MYC, which supply major industry giants such as Boeing and Airbus. The startup has received more than 3,300 pre-orders for its fully electric flying car, which is expected to be priced at around $300,000. Customers can place a pre-order on Alef’s website with a $150 deposit, or pay $1,500 to secure a spot in the priority queue. This partnership with the two airports could pave the way for Alef to introduce flying-car fleets at these key hubs in the future; for the airports themselves, it marks progress toward embracing electric aviation. The upcoming test operations go beyond that, showcasing not only the fusion of car and aircraft technologies but also advanced AI-driven safety systems similar to those used in autonomous vehicles. Alef is already building pre-production models in California, with customer deliveries expected to begin next year.

Wednesday, August 27, 2025

Giant planet discovered

  124 light-years away from Earth, giant planet discovered 

Astronomers report the detection of a new Jupiter-like exoplanet using the High Accuracy Radial velocity Planet Searcher (HARPS). The newfound alien world orbits a nearby M-dwarf star designated GJ 2126. A nearby star has a new heavyweight companion, and it swings around on a path which is anything but neat. The world, labelled GJ 2126 b, traces a stretched orbit which pushes close to its star, then races far away again. Researchers identified the giant world, 124 light-years away, circling its star every 272.7 days on a highly stretched path with an eccentricity of 0.85. It has a minimum mass of about 1.3 Jupiter masses and a semi-major axis of roughly 0.71 astronomical units, about 66 million miles, from its star. These values come from 112 radial velocity measurements with the HARPS spectrograph. “This planet orbits a low-mass star and ranks among the most eccentric exoplanets discovered,” wrote Arbel Schorr from the School of Physics and Astronomy at Tel Aviv University (TAU), who led the study.

The radial velocity (RV) method of detecting an exoplanet is based on detecting variations in the velocity of the central star, caused by the changing direction of the gravitational pull from an unseen exoplanet as it orbits. Thanks to this technique, more than 600 exoplanets have been detected so far. HARPS is a high-resolution visible-light echelle spectrograph installed on the European Southern Observatory (ESO) 3.6-m telescope in Chile. Thanks to its radial-velocity accuracy of about 1 m/s, it is one of the most successful planet finders in history. Most planets in our own neighbourhood move on nearly round routes, so an orbit this stretched stands out. High eccentricity often points to a chaotic past shaped by strong gravitational run-ins. A path like this can reshape a planet’s temperature and atmospheric behaviour across a single year. Such extremes also make modellers revisit how giant planets form and later get knocked around.
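To get a feel for the signal involved, the standard Keplerian relation gives the expected size of the star's wobble from the parameters quoted in this article. This is an illustrative estimate that neglects the planet's mass next to the star's:

```python
import math

# Expected radial-velocity semi-amplitude K for GJ 2126 b, using the
# standard Keplerian relation and the parameters quoted in this article.
m_planet_mjup = 1.3            # minimum mass, Jupiter masses
m_star_msun = 0.65             # host star mass, solar masses
period_years = 272.7 / 365.25  # orbital period in years
ecc = 0.85                     # eccentricity

# K [m/s] ~= 28.43 * (Mp/Mjup) * (M*/Msun)^(-2/3) * (P/yr)^(-1/3) / sqrt(1-e^2)
k = (28.43 * m_planet_mjup
     * m_star_msun ** (-2 / 3)
     * period_years ** (-1 / 3)
     / math.sqrt(1 - ecc ** 2))
print(f"K ~ {k:.0f} m/s")
```

The wobble comes out on the order of 100 m/s, roughly a hundred times HARPS's ~1 m/s precision, which is why 112 measurements were ample to pin down the orbit.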

"We report the discovery of GJ 2126 b, a highly eccentric (e = 0.85) Jupiter-like planet orbiting its host star every 272.7 days. The planet was detected and characterized using 112 RV measurements from HARPS, provided by HARPS-RVBank," the researchers wrote. The team used HARPS, which maintains about 1 meter per second velocity stability; that level of steadiness lets astronomers watch a star’s tiny wobble over many years. They mined the publicly curated HARPS-RVBank, which compiles 252,615 velocities for 5,239 stars observed before January 2022. Public datasets like this let independent teams test ideas and spot signals which earlier searches may have missed. With a semi-major axis of 0.71 AU and an eccentricity of 0.85, periastron, the closest approach, sits near 0.11 AU, about 9.9 million miles, while the farthest point stretches to roughly 1.31 AU, about 122 million miles. Those swings mean large changes in stellar heating across a single 272.7-day year; timing, chemistry and cloud formation likely shift dramatically between close pass and far turn.
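The periastron and apastron figures above follow directly from the semi-major axis and eccentricity, and an inverse-square law gives the swing in stellar heating:

```python
# Orbital extremes implied by a = 0.71 AU and e = 0.85.
a_au = 0.71
e = 0.85

periastron = a_au * (1 - e)   # closest approach
apastron = a_au * (1 + e)     # farthest point

# Stellar flux falls off as 1/r^2, so the heating swing over one orbit:
flux_ratio = (apastron / periastron) ** 2
print(f"periastron {periastron:.2f} AU, apastron {apastron:.2f} AU")
print(f"~{flux_ratio:.0f}x more starlight at closest approach")
```

A roughly 150-fold change in starlight every 272.7 days is the driver behind the dramatic shifts in temperature and cloud formation the article describes.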

The inclination of GJ 2126 b is unknown, so its true mass could be much greater, and the possibility that this object is a brown dwarf cannot be completely excluded. The host star, often listed as an M-dwarf of type M0V, is a cool, low-mass object with about 0.65 times the Sun’s mass and 0.73 times its radius. Its temperature sits near 4,159 kelvin and its metal content is high for a dwarf star. GJ 2126 is a high proper-motion star about 124 light-years from Earth, and its brightness and proximity make follow-up work practical with existing instruments. Because the orbital tilt is unknown, the mass estimate is a lower limit, and the team considered whether the companion might cross into brown-dwarf territory if the orbit is nearly face-on. They argue that Gaia astrometry and the absence of long-term trends disfavor a very massive companion: the paper reports a renormalized astrometric error near unity, a value not expected for a heavy hidden object. The researchers also compared different ways to search for periodicity in unevenly sampled time series. They leaned on the Phase Distance Correlation periodogram, designed to handle non-sinusoidal signals like those from eccentric orbits. They also considered a known trap in velocity work, where two planets in a 2:1 resonance can masquerade as one eccentric planet; their modelling rejected such alternatives for this dataset.
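The general idea behind a period search in unevenly sampled velocities can be sketched with a simple least-squares sinusoid scan on synthetic data. This is only an illustration: a real eccentric orbit is far from sinusoidal, which is exactly why the team preferred the Phase Distance Correlation periodogram over sinusoid-based methods like the one below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, unevenly sampled RV series with a 272.7-day sinusoid.
t = np.sort(rng.uniform(0, 5500, 112))   # 112 epochs over ~15 years
true_period = 272.7
rv = 100 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 5, t.size)

# Scan trial periods; score each by how well a sinusoid fits (least squares).
periods = np.linspace(100, 500, 4000)
scores = []
for p in periods:
    phase = 2 * np.pi * t / p
    design = np.column_stack([np.sin(phase), np.cos(phase), np.ones_like(t)])
    _, residual, *_ = np.linalg.lstsq(design, rv, rcond=None)
    scores.append(residual[0] if residual.size else np.inf)

best = periods[np.argmin(scores)]
print(f"best-fit period: {best:.1f} days")  # recovers the injected period
```

Because the sampling is uneven, naive FFT-style methods fail here; fitting a sinusoid at each trial period works on any time grid, and the true period pops out as the deepest minimum in the residuals.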

Astronomers underline that GJ 2126 b is one of the most eccentric exoplanets discovered around an M-dwarf. Its unique properties place it in a relatively sparse region of the detected exoplanet population, so further observations could help refine planetary formation and evolution scenarios. Cool stars frequently show magnetic activity that adds noise to velocity data, so teams monitor spectral activity indicators to avoid mistaking star spots for planets and to validate true orbits. In this case, the auxiliary indicators did not line up with the 272.7-day signal; that mismatch supports a planetary cause rather than rotating surface features. As for the host, GJ 2126 is a high proper-motion star of spectral type M0V, with a radius of about 0.73 solar radii, a mass of around 0.65 solar masses, a metallicity of 0.6 dex and an effective temperature of 4,159 K, at a distance of approximately 124 light-years.

The dataset spans about fifteen years and covers the critical phases of the orbit; that coverage anchored the fit and strengthened the case for a single object on an extreme path. Giant planets can acquire extreme eccentricities through planet-planet scattering after their birth in a gas disk, and numerical experiments show this process can drive eccentricity above 0.9 without invoking a distant stellar companion. One plausible history is that the system once hosted additional massive bodies which jostled each other until only one remained on a wild orbit; that would line up with the lack of a long-term drift in the present data. The planet’s radius is unknown because no transit has been seen in the available survey photometry, and without the tilt the true mass remains uncertain, so further work aims to refine those values. Future velocity campaigns could detect subtle variations tied to mutual interactions, if any undiscovered companions exist, and continued astrometric monitoring may tighten the mass constraints. Thermal measurements and reflected-light studies would be challenging, yet not out of the question for future facilities; the close approach near periastron may offer the best shot at characterization. Long-baseline velocities will also test for secular changes which hint at additional bodies or tidal effects. A refined inclination would settle the mass question and finally close the door on the brown-dwarf scenario. Further observations of GJ 2126 b are required to determine its radius and constrain its mass, which would shed more light on the composition of this exoplanet.
