All Activity
- Past hour
-
A new study introduces the Community Land Active Passive Microwave Radiative Transfer Modeling platform (CLAP), a unified multi-frequency microwave scattering and emission model designed to advance land surface monitoring. The platform combines active and passive microwave signals to offer more accurate simulations of soil moisture and vegetation conditions. By incorporating advanced interaction models for soil and vegetation, CLAP has the potential to address key limitations in existing remote sensing technologies and to improve land monitoring precision. The study showcases CLAP's ability to improve microwave signal simulations, especially at high frequencies, marking a major step forward for ecosystem management and climate change research.

Microwave remote sensing is essential for land monitoring, providing crucial insights into soil moisture and vegetation health by measuring the microwave radiation emitted and the backscatter scattered by the surface. However, current models rely heavily on zeroth-order radiative transfer theory and empirical assumptions, often overlooking dynamic changes in vegetation and soil properties (structure, moisture, and temperature). These limitations result in inconsistencies and reduced accuracy across frequencies and polarizations. Given these challenges, more refined research into the scattering and emission mechanisms of multi-frequency microwave signals is urgently needed to improve the precision and reliability of remote sensing technologies.

A team of researchers from the University of Twente has published a paper in the Journal of Remote Sensing introducing CLAP, a multi-frequency microwave scattering and emission model that integrates an advanced soil surface scattering model (ATS+AIEM) and a vegetation scattering model (TVG). CLAP incorporates appropriate vegetation structure as well as dynamic vegetation water content (VWC) and temperature changes, significantly improving upon existing approaches. CLAP also uncovers the frequency-dependent nature of grassland optical depth and highlights the significant impact of vegetation temperature on high-frequency signals, offering new insights for more accurate vegetation and soil monitoring.

The core strength of CLAP lies in its detailed modeling of soil and vegetation components. The team used long-term in situ observations from the Maqu site (microwave signals, soil moisture, temperature profiles, and vegetation data) to drive CLAP and to evaluate its performance. During summer, CLAP with cylinder parameterization for vegetation representation simulated grassland backscatter at X-band and C-band with RMSE values of 1.8 dB and 1.9 dB, respectively, compared to 3.4 dB and 3.0 dB with disk parameterization. The study also found that vegetation temperature variations significantly affect diurnal changes in high-frequency signals, while changes in vegetation water content primarily influence low-frequency signals. For example, at C-band, vegetation temperature fluctuations had the greater impact on signal changes (correlation coefficient R of 0.34), while at S-band, vegetation water content had the stronger influence (R of 0.46). These findings underscore the importance of dynamic vegetation and soil properties in microwave scattering and emission processes, which CLAP accurately reflects.

Dr. Hong Zhao, the lead researcher, commented: "The CLAP platform represents a major advancement in microwave remote sensing. By incorporating appropriate vegetation structure, dynamic vegetation and soil water content and temperature into the model, CLAP offers a more accurate representation of microwave signal scattering and emission processes. This innovation will significantly enhance our ability to monitor vegetation and soil conditions, providing more reliable data for ecosystem management and climate change research."

The team utilized extensive in situ data from the Maqu site as well as satellite microwave observations. These comprehensive datasets allowed the researchers to rigorously assess CLAP's performance across frequencies and polarizations. The development of CLAP opens new possibilities for microwave remote sensing: the technology can be integrated into upcoming satellite missions such as CIMR and ROSE-L to enhance the precision of soil moisture and vegetation monitoring, and it can be incorporated into data assimilation frameworks to provide more accurate inputs for land surface models. Widespread application promises a profound impact on global environmental monitoring, agricultural production, and climate change research, supporting sustainable development efforts worldwide. source: https://dx.doi.org/10.34133/remotesensing.0415
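For readers unfamiliar with the baseline the authors are improving on, here is a minimal sketch of the standard zeroth-order (tau-omega) emission model that the article says existing approaches rely on. This is illustrative only and is not CLAP's formulation; all parameter values are hypothetical examples for a single frequency and polarization.

```python
import numpy as np

def tau_omega_tb(soil_temp_k, veg_temp_k, soil_emissivity, tau_nadir, omega, theta_deg):
    """Zeroth-order (tau-omega) brightness temperature over a vegetated surface.

    Illustrative baseline only -- not the CLAP formulation. Inputs are example
    values for one polarization and one frequency.
    """
    theta = np.radians(theta_deg)
    gamma = np.exp(-tau_nadir / np.cos(theta))   # vegetation transmissivity along the slant path
    r_soil = 1.0 - soil_emissivity               # soil reflectivity

    tb_soil = soil_temp_k * soil_emissivity * gamma              # soil emission attenuated by the canopy
    tb_veg_up = veg_temp_k * (1.0 - omega) * (1.0 - gamma)       # direct canopy emission
    tb_veg_refl = tb_veg_up * r_soil * gamma                     # canopy emission reflected by the soil
    return tb_soil + tb_veg_up + tb_veg_refl

# Example with made-up values: moist soil under summer grassland at ~40 deg incidence
print(tau_omega_tb(soil_temp_k=290.0, veg_temp_k=295.0,
                   soil_emissivity=0.75, tau_nadir=0.3, omega=0.05, theta_deg=40.0))
```

The study's point is that this kind of static, zeroth-order treatment ignores vegetation structure and diurnal temperature/water-content dynamics, which is exactly what CLAP's higher-order soil and vegetation scattering components are designed to capture.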
-
Researchers at Aalto University have, for the first time, investigated the occurrence of wolverines across the whole of Finland using satellite imagery, field measurements, and snow track observations. The wolverine, a predator typically found in the fells and forests of northern Finland, was classified as endangered in the country as early as the 1980s. Although information on the species' historical range is limited, wolverines are known to have inhabited southern Finland as recently as the 19th century; hunting caused the species to disappear from the region. This study, published in the journal Ecology and Evolution, is the first to provide nationwide data on the types of habitats favored by wolverines as they expand into new areas.

"The species is returning to its historical range in southern Finland. According to our research, the deciduous-dominated mixed forests typical of the south may be more important habitats for wolverines than previously thought," says Pinja-Emilia Lämsä, a doctoral researcher at Aalto University.

Despite recent population growth, the wolverine's survival remains threatened by its small population size, low genetic viability, and fragmented distribution. However, the study's use of remote sensing and field data offers vital information for safeguarding biodiversity. "Understanding habitats is essential for improving species conservation and management," says Professor Miina Rautiainen, a remote sensing expert at Aalto University.

Fragmentation of forest landscapes poses a threat

The study found that wolverines tend to favor large, forested areas with deciduous trees. They were rarely observed near recent clear-cuts, whereas older felling sites, about 10 years old, were more attractive. Wolverines also preferred areas with less dense tree cover. Previous studies on wolverine habitats and distribution have mainly focused on mountainous regions with vegetation that differs significantly from the low-lying boreal forests of Finland. According to Pinja-Emilia Lämsä, it is crucial to understand which habitats wolverines prefer specifically in Finland, where forestry practices strongly influence forest structure. "In Finland, the average forest compartment (a uniform section of forest in terms of tree species and site conditions) is relatively small. This can lead to a patchwork-like fragmentation of forest landscapes in forest management decisions. To protect wolverine habitats, mixed-species forest should be prioritized and large, continuous forest areas preserved," Lämsä says.

Remote sensing reveals impacts of environmental change

The study, conducted in collaboration with the Natural Resources Institute Finland (Luke), combined snow track counts of wolverines with national forest inventory data based on satellite images and field measurements. This approach allowed the researchers to examine the influence of forest characteristics on wolverine presence at a large scale. According to Rautiainen, remote sensing is an excellent tool for studying the distribution of animal species across broad areas, as satellite and aerial images provide increasingly detailed information about changes in forest landscapes and their impacts on wildlife. "In the future, remote sensing will enable us to monitor in even greater detail how, for example, changes in vegetation or other environmental factors in Finland affect animal populations," Rautiainen says. source: https://dx.doi.org/10.1002/ece3.71300
- Last week
-
We’ve all experienced that moment of frustration when the GPS glitches and you miss an exit on the highway. The team at Tern AI, which is building a low-cost GPS alternative, says that’s because the current technology is limited by its reliance on satellite positioning. Tern AI says it has figured out how to locate the position of a vehicle using only map information and a vehicle’s existing sensor data. The company’s pitch: it’s a cheap system that doesn’t require any additional expensive sensors.

“No triangulation, no satellites, no Wi-Fi, nothing. We just figure out where we are as we drive,” Brett Harrison, co-founder and president, told TechCrunch while Cyrus Behroozi, senior software developer at Tern, loaded up the demo on his iPhone. “That’s really game changing because as we move away from triangulation-based, which limits technology, now we have the ability to be fully off that grid.”

Harrison says this breakthrough is important for a number of reasons. From a commercial standpoint, companies that rely on GPS, from ride-hail apps to delivery companies, lose time, money, and gas every time their drivers have to double back because of faulty GPS positioning. More importantly, our most critical systems, from aviation to disaster response to precision farming, rely on GPS. Foreign adversaries have already demonstrated that they can spoof GPS signals, which could have catastrophic impacts on both the economy and national security. The U.S. has signaled that it wants to prioritize alternatives to GPS. During his first term, President Donald Trump signed an executive order to reduce reliance on a single source of PNT (positioning, navigation, and timing) services, like GPS. Several other initiatives direct agencies and bodies such as the Department of Defense and the National Security Council to ensure resilient PNT by testing and integrating non-GPS technologies.

Testing Tern’s system in Austin

To start the demonstration, Behroozi connected his 2019 Honda Civic to his phone via Bluetooth, allowing the Tern application to pull in data from the vehicle’s existing sensors. He noted that Tern’s tech can be integrated directly into vehicles from model year 2009 onward. Usually, Tern sets the position manually to speed things up, but for our demo the team wanted a “cold start.” Behroozi turned off his phone’s location services, so the Tern intelligent system had only a cached map of a 500-square-mile boundary around Austin and the vehicle’s sensors to work with. As the car drove, the system picked up road data to work toward “convergence.” It took roughly 10 minutes for the system to reach full convergence from a cold start because, according to Behroozi, traffic limited our movements. Harrison assured me convergence usually takes around one to two minutes without a start point, and is immediate with one.

Harrison noted that Tern’s system can also localize vehicles in parking garages, tunnels, and on mountains, which GPS struggles to do. Harrison wouldn’t explain exactly how, saying the information is “proprietary.” We drove around for a few more minutes after the system reached full convergence, and I watched as it steadily tracked our precise movements in a way that appeared as good as, and in some cases better than, GPS. That became more apparent when we drove into downtown Austin, where my Google Maps regularly mislocated me throughout the week as I navigated urban streets dotted with towering buildings.
Harrison said that Tern’s system is also safer from a privacy perspective. “Our system is a total closed loop,” he said. “Right now, we’re not emitting anything. It’s independently deriving its own position [via edge computing], so there are no external touchpoints.” source: techcrunch.com
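Tern has not disclosed its algorithm, but the general idea of localizing from a road map plus a vehicle's own speed and heading data can be illustrated with a simple particle-filter map-matching sketch. Everything below (the map format, the noise levels, the weighting rule) is a hypothetical stand-in, not Tern's implementation; it only shows why a cold start "converges" as distinctive turns rule out wrong roads.

```python
import numpy as np

# Hypothetical road map: (x, y) points sampled along road centerlines, in meters.
road_points = np.random.rand(2000, 2) * 1000.0  # placeholder map data

def nearest_road_distance(positions):
    """Distance from each particle to the closest road point (brute force, for clarity)."""
    d = np.linalg.norm(positions[:, None, :] - road_points[None, :, :], axis=2)
    return d.min(axis=1)

def step(particles, speed_mps, heading_rad, dt):
    """Propagate particles with noisy odometry, then reweight by agreement with the map."""
    n = len(particles)
    v = speed_mps + np.random.normal(0.0, 0.3, n)      # wheel-speed noise
    h = heading_rad + np.random.normal(0.0, 0.02, n)   # heading noise
    particles[:, 0] += v * dt * np.cos(h)
    particles[:, 1] += v * dt * np.sin(h)

    # Vehicles are assumed to stay on roads: weight each hypothesis by map proximity.
    w = np.exp(-0.5 * (nearest_road_distance(particles) / 5.0) ** 2)
    w /= w.sum()

    # Resample; over time the cloud collapses onto the one road geometry consistent
    # with the driven trajectory, which is the "convergence" described in the article.
    idx = np.random.choice(n, size=n, p=w)
    return particles[idx]

particles = np.random.rand(1000, 2) * 1000.0        # cold start: uniform over the cached map area
for _ in range(100):                                 # feed in successive odometry readings
    particles = step(particles, speed_mps=12.0, heading_rad=0.4, dt=0.1)
print(particles.mean(axis=0))                        # position estimate
```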
-
Sony announced the AS-DT1, the world’s smallest and lightest miniature precision LiDAR depth sensor. Measuring a mere 29 by 29 by 31 millimeters (1.14 by 1.14 by 1.22 inches) excluding protrusions, the Sony AS-DT1 LiDAR Depth Sensor relies on sophisticated miniaturization and optical lens technologies from Sony’s machine vision industrial cameras to accurately measure distance and range.

The device utilizes “Direct Time of Flight” (dToF) LiDAR technology and features a Sony Single Photon Avalanche Diode (SPAD) image sensor. As Sony Semiconductor Solutions Corporation describes, a SPAD sensor promises exceptional photon detection efficiency, ensuring the sensor can detect even very weak photons emitted from the light source and reflected off an object. This efficiency is crucial, because reflected light is precisely how LiDAR works: Light Detection and Ranging measures distance by timing how long emitted photons take to bounce off an object and return to the sensor. The more photon-efficient the image sensor, the better the accuracy.

Compared to the CMOS image sensors photographers are familiar with, which detect light by measuring the volume of light that accumulates inside individual pixels over a specified time frame, SPAD sensors can detect a single photon and digitally count photon arrivals without accuracy or noise issues. SPAD image sensors are fundamentally different from, and significantly more efficient than, CMOS sensors. So why don’t all cameras use SPAD sensors? While they are very good at detecting single photons, they are not well suited to measuring the much larger amounts of light a traditional camera needs to capture. They are also costly, low in resolution, and inflexible. To illustrate where the technology stands in terms of resolution, it was big news when Canon unveiled a one-megapixel SPAD sensor less than five years ago. Sony does not say much about the specific SPAD sensor in its new AS-DT1 LiDAR Depth Sensor; there aren’t many SPAD sensors in Sony’s sensor catalog, and the few that are there are small and have relatively few pixels.

Nonetheless, Sony is bullish on its new AS-DT1 device. Thanks to its small size and impressive SPAD sensor, the company says it is “ideal for applications where space and weight constraints are paramount, including drones, robotics, and more.” It is reasonable to suspect the device could also be helpful in self-driving cars. Any situation needing very accurate depth and distance measurements in challenging lighting scenarios is well suited to something like the AS-DT1.

“The AS-DT1 can measure distances to low-contrast subjects and objects with low reflectivity, which are more difficult to detect with other ranging methods. This enables accurate measurement of distances in diverse environments, such as retail stores, where various objects, including people and fixtures, are expected,” Sony explains. “In addition to its ability to accurately measure distances both indoors and outdoors, the sensor’s compact, lightweight design and rigid aluminum housing allow for integration into a wide range of devices, such as food service robots in restaurants, autonomous mobile robots in warehouses, and drones used for inspections and surveys.”

The Sony AS-DT1 can measure at various distances with exceptional accuracy. For example, Sony claims it can measure the distance to objects 10 meters (32.8 feet) away with a margin of error of five centimeters (nearly two inches) indoors and outdoors.
The company further claims the AS-DT1 is superior to competing imaging devices when dealing with low-contrast subjects, objects with low reflectivity, and floating objects. The AS-DT1 can accurately measure up to 40 meters (131.2 feet) indoors and 20 meters (65.6 feet) outdoors under bright summer conditions, which Sony says can be challenging “when inspecting infrastructure such as bridges, highways, and dams.” Given its small size and how valuable drones are for infrastructure inspection, this is a particularly attractive use case for the AS-DT1. source: petapixel
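The dToF principle described above reduces to a single relation: range is half the photon round-trip time multiplied by the speed of light. The tiny calculation below is not Sony's firmware, just the underlying physics, and it shows the sub-nanosecond timing resolution implied by the quoted 5 cm figure, which is why single-photon (SPAD) timing is attractive here.

```python
C = 299_792_458.0  # speed of light, m/s

def dtof_range_m(round_trip_s):
    """Direct time-of-flight: the emitted pulse travels to the target and back."""
    return C * round_trip_s / 2.0

# A target 10 m away returns the pulse after roughly 66.7 nanoseconds.
round_trip = 2 * 10.0 / C
print(f"round trip for 10 m: {round_trip * 1e9:.1f} ns")
print(f"recovered range: {dtof_range_m(round_trip):.2f} m")

# A 5 cm range error corresponds to only ~0.33 ns of timing error.
print(f"timing error for 5 cm: {2 * 0.05 / C * 1e9:.2f} ns")
```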
-
Australia’s Q-CTRL has announced the first real-world demonstration of its commercially viable quantum navigation system. The system works without Global Positioning Systems (GPS), cannot be jammed, and is already proving to be drastically more accurate than anything else. This is a big deal because many vehicles worldwide, including planes and cars, rely heavily on GPS for navigation. However, GPS can be jammed, spoofed, or even denied, especially during military conflicts or cyberattacks. This is a growing concern for national security and for autonomous vehicles, which need constant, accurate location data. In fact, according to a press release by Q-CTRL, GPS jamming has been shown to disrupt around 1,000 flights every day, and an outage on this scale is estimated to cost the global economy around $1 billion daily. Finding a reliable backup to GPS is therefore critical, especially for defense and autonomous systems.

Navigation without GPS

To this end, Q-CTRL developed a new system called “Ironstone Opal,” which uses quantum sensors to navigate without GPS. It is passive (meaning it doesn’t emit signals that could be detected or jammed) and highly accurate. Instead of relying on satellites, Q-CTRL’s system reads the Earth’s magnetic field, which varies slightly depending on location, like a magnetic fingerprint or map. The system determines where you are by measuring these variations with magnetometers. This is made possible by the company’s proprietary quantum sensors, which are incredibly sensitive and stable. The system also comes with special AI-based software that filters out interference such as vibrations or electromagnetic noise (what the company calls “software ruggedization”). Q-CTRL ran live tests on the ground and in the air to validate the technology. As anticipated, the system operated completely independently of GPS. Moreover, the company reports that its quantum GPS was 50 times more accurate than traditional GPS backup systems (such as Inertial Navigation Systems, or INS), delivering navigation precision on par with hitting a bullseye from 1,000 yards.

Technology now proven

Even when the equipment was mounted inside a plane, where interference is much worse, it outperformed existing systems by at least 11x. This is the first time quantum technology has been shown to outperform existing tech in a real-world commercial or military application, a milestone referred to as achieving “quantum advantage.” Because of its stealthy, jam-proof, and high-precision nature, this tech is highly attractive to military forces, notably Australia, the UK, and the US. It could also prove valuable to commercial aviation companies, autonomous vehicles, and drones, and it could be a game-changer for navigation in hostile environments, GPS-denied zones, or deep-sea and mountainous regions where GPS doesn’t work well.

“At Q-CTRL, we’re thrilled to be the global pioneer in taking quantum sensing from research to the field, being the first to enable real capabilities that have previously been little more than a dream,” said Q-CTRL founder and CEO Michael Biercuk. “This is our first major system release, and we’re excited that there will be much more to come as we introduce new quantum-assured navigation technologies tailored to other commercial and defense platforms,” he added. source: interestingengineering
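Q-CTRL has not published the details of Ironstone Opal, but magnetic-anomaly navigation in general works by comparing a sequence of magnetometer readings taken along the vehicle's track against a stored anomaly map. The toy sketch below illustrates only that matching step; the map, the noise level, and the straight east-west track are entirely made-up assumptions, not the company's method.

```python
import numpy as np

# Hypothetical magnetic anomaly map (nanotesla) on a regular grid, plus a noisy
# profile of field strengths "measured" along a straight track of known length.
anomaly_map = np.random.randn(200, 200) * 50.0
true_row, true_col = 120, 40
measured_profile = anomaly_map[true_row, true_col:true_col + 20] + np.random.normal(0.0, 2.0, 20)

def best_track_match(anomaly_map, profile):
    """Slide the measured profile over every candidate track and score the mismatch."""
    rows, cols = anomaly_map.shape
    n = len(profile)
    best, best_score = None, np.inf
    for r in range(rows):
        for c in range(cols - n):
            score = np.sum((anomaly_map[r, c:c + n] - profile) ** 2)
            if score < best_score:
                best, best_score = (r, c), score
    return best

print(best_track_match(anomaly_map, measured_profile))  # should recover (120, 40)
```

In practice the matching runs inside a navigation filter alongside inertial data rather than as a brute-force search, and the sensor noise rejection ("software ruggedization") is where the proprietary work lies.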
- Earlier
-
Are you ready to level up your geospatial skills? Join our comprehensive training course covering ArcMap, ArcGIS Pro, and ArcGIS Online, the essential tools for modern spatial analysis and programming!

What You’ll Learn:
- Core functionalities of ArcMap & ArcGIS Pro
- Cloud-based mapping with ArcGIS Online
- Automating workflows with Python & ModelBuilder
- Creating interactive web maps & apps

Who Should Enroll?
- GIS beginners & professionals
- Urban planners, environmental scientists, & data analysts
- Developers looking to integrate spatial programming

Why Choose This Course?
- Hands-on projects & real-world applications
- Expert-led sessions & flexible learning
- Limited slots available!

Click here to register. Let’s shape the future of spatial data together!
-
GPS is an incredible piece of modern technology. Not only does it allow for locating objects precisely anywhere on the planet, but it also enables the turn-by-turn directions we take for granted these days, all without needing anything more than a radio receiver and some software to decode the signals constantly being sent down from space. [Chris] took that last bit as something of a challenge and set off to write a software-defined GPS receiver from the ground up.

As GPS started as a military technology, the level of precision needed for things like turn-by-turn navigation wasn’t always available to civilians. The “coarse” positioning is only capable of accuracy within a few hundred meters, so this legacy capability is the first thing that [Chris] tackles here. It is pretty fast, though, with the system able to resolve a location in 24 seconds from a cold start and then display its information in a browser window. Everything in this build is done in Python, meaning that it’s a great starting point for investigating how GPS works and for building other projects from there.

The other thing that makes this project accessible is that the only hardware needed besides a computer that runs Python is an RTL-SDR dongle. These inexpensive TV dongles ushered in a software-defined radio revolution about a decade ago when it was found that they could receive a wide array of radio signals beyond just TV. source: Hackaday and GitHub - chrisdoble/gps-receiver
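The linked repository contains the full receiver, but the heart of any cold-start acquisition stage, finding which satellites are visible along with their code phase and Doppler, is classically done with FFT-based circular correlation. The sketch below illustrates that idea only; it is not [Chris]'s code. The `ca_code` helper is a hypothetical stand-in for a real C/A Gold-code generator, and the sample rate is a typical RTL-SDR value rather than one taken from the project.

```python
import numpy as np

FS = 2_048_000          # assumed RTL-SDR sample rate (Hz), ~2x the 1.023 Mchip/s C/A chip rate
CODE_LEN = 2048         # samples in one 1 ms C/A code period at FS

def ca_code(prn):
    """Hypothetical stand-in for a C/A Gold-code generator.
    A real receiver derives the PRN's 1023-chip Gold code from two LFSRs; here we
    just return a deterministic +/-1 sequence resampled to FS so the sketch runs."""
    chips = np.random.default_rng(prn).choice([-1.0, 1.0], size=1023)
    return chips[(np.arange(CODE_LEN) * 1023) // CODE_LEN]

def acquire(samples, prn, doppler_bins=np.arange(-5000, 5001, 500)):
    """Parallel code-phase search: correlate 1 ms of IQ samples against a local replica."""
    block = samples[:CODE_LEN]
    replica_fft = np.conj(np.fft.fft(ca_code(prn)))
    t = np.arange(CODE_LEN) / FS
    best = (0.0, None, None)                              # (peak power, doppler Hz, code phase samples)
    for fd in doppler_bins:
        wiped = block * np.exp(-2j * np.pi * fd * t)      # remove carrier offset + Doppler
        corr = np.fft.ifft(np.fft.fft(wiped) * replica_fft)
        power = np.abs(corr) ** 2
        k = int(np.argmax(power))
        if power[k] > best[0]:
            best = (float(power[k]), fd, k)
    return best  # a strong, isolated peak means the satellite is visible

# Example: search for PRN 7 in one millisecond of (placeholder) captured IQ samples
iq = np.random.randn(CODE_LEN) + 1j * np.random.randn(CODE_LEN)
print(acquire(iq, prn=7))
```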
-
ahahahah, just to make sure thank you
-
Of course it is 😅
-
Rocket Lab launched a synthetic aperture radar (SAR) imaging satellite for a Japanese company March 14, the first of eight such missions Rocket Lab has under contract with that customer. The Electron rocket lifted off from Pad B of Rocket Lab’s Launch Complex 1 at Mahia Peninsula, New Zealand, at 8 p.m. Eastern. The payload, the QPS-SAR-9 satellite, separated from the kick stage nearly an hour later after being placed into a planned orbit of 575 kilometers at an inclination of 42 degrees.

The satellite is the latest for the Institute for Q-shu Pioneers of Space, Inc. (iQPS), a Japanese company with long-term ambitions to operate a constellation of 36 SAR satellites to provide high-resolution radar imagery. Rocket Lab announced in February two separate contracts with iQPS, each for four launches, with each launch carrying a single satellite. Six of the launches are scheduled for this year and the other two in 2026. This launch was the first under those contracts and the second overall for iQPS, after a launch of the QPS-SAR-5 satellite in December 2023.

The launch is the third this year by Rocket Lab, with the next, carrying the final set of five Kinéis tracking satellites, scheduled for as soon as March 17. Rocket Lab said in an earnings call Feb. 27 that it was planning “more than 20” Electron launches this year, counting both orbital missions and those of its HASTE suborbital variant. “To hit scale is a really important part of the equation,” Brian Rogers, vice president of global launch services at Rocket Lab, said during a launch panel at the Satellite 2025 conference March 10. “Being able to hit cadence by any means necessary is the secret sauce.” source: SpaceNews
-
Maxar Intelligence developed a visual-based navigation technology that enables aerial drones to operate without relying on GPS, the company announced March 25. The software, called Raptor, provides a terrain-based positioning system for drones in GPS-denied environments by leveraging detailed 3D models created from Maxar’s satellite imagery. Instead of using satellite signals, a drone equipped with Raptor compares its real-time camera feed with a pre-existing 3D terrain model to determine its position and orientation. Peter Wilczynski, chief product officer at Maxar Intelligence, explained that the Raptor software has three main components. One is installed directly on the drone, enabling real-time position determination. Another application georegisters the drone’s video feed with Maxar’s 3D terrain data. A separate laptop-based application works alongside drone controllers, allowing operators to extract precise ground coordinates from aerial video feeds. “This system was designed to plug in and be a proxy for GPS,” Wilczynski said. The 3D terrain data is regularly updated, and Maxar can task its satellites to refresh information for specific regions of interest based on customer needs, he said. source: SpaceNews
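Maxar has not published Raptor's internals, but the general recipe for localizing a camera against a 3D terrain model is to match features between the live frame and model points whose 3D coordinates are known, then solve a Perspective-n-Point (PnP) problem for the camera pose. The OpenCV sketch below shows only that final pose-solving step under stated assumptions: the correspondence arrays and camera intrinsics are placeholders, not Raptor data.

```python
import numpy as np
import cv2

# Placeholder correspondences: 3D terrain-model points (local frame, meters) and the
# matched 2D pixel locations of those same points in the drone's current video frame.
object_points = (np.random.rand(20, 3) * 100.0).astype(np.float32)
image_points = (np.random.rand(20, 2) * np.array([1920.0, 1080.0])).astype(np.float32)

# Placeholder pinhole intrinsics for the drone camera (focal length and principal point in pixels).
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]], dtype=np.float32)

# RANSAC PnP rejects bad feature matches and returns the camera pose (rotation + translation)
# relative to the terrain model, i.e. a position fix with no GPS involved.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(object_points, image_points, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)
    camera_position = (-R.T @ tvec).ravel()  # camera center in terrain-model coordinates
    print(camera_position)
```

With random placeholder points the pose is meaningless; in a real pipeline the correspondences would come from georegistering the video feed against Maxar's 3D terrain data, as the article describes.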
-
nice, so this tool is like automation for all the 7 steps above? am i correct?
-
yousef2233 started following DEM2STREAM ArcGIS Desktop tool
-
Hi everyone,

This straightforward tool generates a stream order network from elevation data (a DEM). The only input required is a DEM. As an open-source tool, it is accessible and easy to use. If you encounter any issues, feel free to contact me at [email protected]. Tested with ArcGIS Desktop 10.8.1😊 Let me know if you'd like further refinements.

Download Link

To create a stream network and determine its order in ArcGIS starting from a Digital Elevation Model (DEM), follow these general steps (a scripted version is sketched after this list):

Step 1: Prepare the DEM. Load your DEM into ArcGIS and ensure it is hydrologically correct, without errors such as sinks or pits. Use the Fill tool from the Spatial Analyst toolbox to fill these voids.

Step 2: Flow Direction. Use the Flow Direction tool to compute the direction of water flow across the DEM surface. This creates a raster that assigns a flow direction to each cell.

Step 3: Flow Accumulation. Apply the Flow Accumulation tool to calculate the amount of flow accumulated in each cell based on the flow direction raster.

Step 4: Stream Threshold. Set a threshold value on the Flow Accumulation raster to define streams. The Con (conditional) tool can be used to extract cells that meet this threshold. This step essentially defines what qualifies as a stream.

Step 5: Stream Link. Use the Stream Link tool to assign unique identifiers to connected stream segments.

Step 6: Stream Order. Apply the Stream Order tool to calculate the hierarchical order of streams (e.g., Strahler or Shreve order).

Step 7: Vectorization (optional). Convert the raster stream network to a vector format using the Stream to Feature tool, which makes the streams easier to visualize and analyze.

With these steps, you'll have a stream network derived from your DEM, with ordered streams that can be used for further hydrological analysis.
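For anyone who prefers scripting the workflow instead of clicking through the toolboxes, the same seven steps can be chained with ArcPy's Spatial Analyst module. This is a generic sketch, not the DEM2STREAM tool's own source code; the paths and the 1000-cell threshold are example values you would replace with your own.

```python
import arcpy
from arcpy.sa import (Fill, FlowDirection, FlowAccumulation, Con,
                      StreamLink, StreamOrder, StreamToFeature)

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\hydro.gdb"   # example workspace
arcpy.env.overwriteOutput = True
dem = r"C:\data\dem.tif"                     # example input DEM

filled = Fill(dem)                                    # Step 1: fill sinks/pits
flow_dir = FlowDirection(filled)                      # Step 2: D8 flow direction
flow_acc = FlowAccumulation(flow_dir)                 # Step 3: upstream cell counts
streams = Con(flow_acc > 1000, 1)                     # Step 4: threshold defines streams (example: 1000 cells)
links = StreamLink(streams, flow_dir)                 # Step 5: unique IDs per segment
order = StreamOrder(streams, flow_dir, "STRAHLER")    # Step 6: Strahler stream order
StreamToFeature(order, flow_dir, "stream_network", "NO_SIMPLIFY")  # Step 7: vectorize

order.save("stream_order")
arcpy.CheckInExtension("Spatial")
```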
-
NASA and the Italian Space Agency made history on March 3 when the Lunar GNSS Receiver Experiment (LuGRE) became the first technology demonstration to acquire and track Earth-based navigation signals on the Moon’s surface. The LuGRE payload’s success in lunar orbit and on the surface indicates that signals from GNSS (Global Navigation Satellite System) constellations can be received and tracked at the Moon. These results mean NASA’s Artemis missions, or other exploration missions, could use these signals to accurately and autonomously determine their position, velocity, and time, a steppingstone to advanced navigation systems and services for the Moon and Mars.

“On Earth we can use GNSS signals to navigate in everything from smartphones to airplanes,” said Kevin Coggins, deputy associate administrator for NASA’s SCaN (Space Communications and Navigation) Program. “Now, LuGRE shows us that we can successfully acquire and track GNSS signals at the Moon. This is a very exciting discovery for lunar navigation, and we hope to leverage this capability for future missions.”

The road to the historic milestone began on March 2 when Firefly Aerospace’s Blue Ghost lunar lander touched down on the Moon and delivered LuGRE, one of 10 NASA payloads intended to advance lunar science. Soon after landing, LuGRE payload operators at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, began conducting their first science operation on the lunar surface. With the receiver data flowing in, anticipation mounted: could a Moon-based mission acquire and track signals from two GNSS constellations, GPS and Galileo, and use those signals for navigation on the lunar surface? Then, at 2 a.m. EST on March 3, it was official: LuGRE had acquired and tracked signals on the lunar surface for the first time ever and achieved a navigation fix approximately 225,000 miles from Earth.

Now that Blue Ghost is on the Moon, the mission will operate for 14 days, giving NASA and the Italian Space Agency the opportunity to collect data in a near-continuous mode and leading to additional GNSS milestones. In addition to this record-setting achievement, LuGRE is the first Italian Space Agency-developed hardware on the Moon, a milestone for the organization.

The LuGRE payload also broke GNSS records on its journey to the Moon. On Jan. 21, LuGRE surpassed the highest-altitude GNSS signal acquisition ever recorded, at 209,900 miles from Earth, a record formerly held by NASA’s Magnetospheric Multiscale Mission. Its altitude record continued to climb as LuGRE reached lunar orbit on Feb. 20, at 243,000 miles from Earth. This means that missions in cislunar space, the region between Earth and the Moon, could also rely on GNSS signals for navigation fixes. source: NASA
-
Dear all, could someone share the link to download this software again, please? Best regards.
-
What were you doing when that error came up? From a quick look, the error seems to be a licensing problem; the crack ("jamu") is probably no good.
-
Could someone please advise how to get rid of the window shown in this picture: https://photos.app.goo.gl/KUCfUyucJLu7NtXG8
-
The photo comparing the changes between April 26, 2022 and February 19, 2024 was in fact released on February 19, 2024.

The loss of forest cover, or deforestation, in the IKN (new capital city) area has also been recorded by Forest Watch Indonesia (FWI). Over a three-year period (2018-2021), deforestation in the IKN area reached 18,000 hectares, of which 14,010 hectares were in production forest, 3,140 hectares in Other Use Areas, 807 hectares in the Grand Forest Park (Tahura), 9 hectares in Protection Forest, and 15 hectares in other areas. FWI's 2023 records indicate that over 2022 and up to June 2023, the deforested area reached 1,663 hectares.

In response, Pungky Widiaryanto, Director of Forestry Utilization Development and Water Resources at the IKN Authority, acknowledged that the issue of forest cover change in Kalimantan, and in the IKN area in particular, has drawn attention from many parties, both supporters and critics. Nevertheless, Pungky felt it necessary to offer a clarification so that public understanding would improve: before construction began in 2022, the IKN area was dominated by industrial plantation forest, mainly eucalyptus. Its fast growth and short harvest cycle make it the main choice for industrial plantation forestry. "Therefore, the changes visible in satellite imagery may reflect pre-existing industrial plantation forest management activities," Pungky told Kompas.com on Tuesday (28/1/2025).

Meanwhile, IKN is designed with sustainability as its top priority. Of the total area of 252,660 hectares, only 25 percent will be used for buildings, facilities, and infrastructure. The remaining 75 percent will be regreened with a variety of tree species native to Kalimantan, not just eucalyptus. The strategy is to use the existing eucalyptus trees as shade for the new plantings; as the eucalyptus dies off, the native Kalimantan trees will be ready to thrive. From 2022 to date, reforestation has been carried out over 8,420 hectares within the IKN delineation area. The planting involves various parties, including government agencies, private companies, foundations, and universities, in managing the urban forest.

Pungky acknowledged that the target of turning 65 percent of the IKN area into protected zones with tropical rainforest cover is ambitious. "This is a major undertaking that requires support from all quarters. We invite the entire community to take part in this reforestation effort," Pungky added. To that end, the Deputy for Environment and Water Resources is developing funding mechanisms with strong potential to support the reforestation target. source: Kompas