Fluid Monitoring & Detection | Envirotec
Technology in the environment

Turbine flow meters help dispense clean fuel
12 January 2024

This article contains paid-for content produced in collaboration with Titan.

KOKO Networks in East Africa has created a scalable solution to the dirty fuels problem. The clean ethanol specialist’s distribution model uses bioethanol as an ultra-clean, cheap and safe alternative to the more traditional cooking fuels of charcoal and paraffin. And its solutions incorporate Titan’s NSF-Approved 800 series flowmeters.

Designed and built by KOKO Networks (Nairobi), the KOKO Point vending machine dispenses ethanol and is part of an elegant solution to a widespread problem in African cities: getting cheap, safe fuel into homes for cooking. Across Africa, where electricity connections are patchy and mains gas is non-existent, cooking fuel has been dominated by charcoal and paraffin, both of which emit fumes and soot, contributing to thousands of premature deaths every year. Made from corn or sugar cane, ethanol is much safer but can be hard to package and transport.

KOKO Networks has installed over 2,000 KOKO Point vending machines to date in corner stores across Kenya, including Mombasa and Nairobi. Drawn by the allure of this safe, clean cooking fuel, customers dispense it into reusable KOKO fuel bottles and, once at home, they connect the bottles to KOKO-designed twin-hob stoves, which have an airlock to limit fumes and prevent spillage.

Neil Hannay, Senior R&D Engineer at Titan Enterprises, commented: “We were approached by KOKO Networks to work with them on this life-changing development for African families. KOKO Networks required a highly reliable and inexpensive flow meter capable of precisely dispensing batches of ethanol cooking fuel. As a result of our joint development, at the heart of each KOKO Point ethanol vending machine is a precision Pelton wheel turbine flowmeter that rotates freely on robust sapphire bearings, combining materials and technology to ensure a long-life product with reliable operation throughout.”

Titan’s NSF-Approved 800 series turbine flow meters are ideally suited for applications requiring both food hygiene and precision flow measurement.
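For illustration, the batch-dispensing principle described above is straightforward: a turbine meter emits a pulse train whose frequency is proportional to flow rate, and dividing the accumulated pulse count by the meter's K-factor gives dispensed volume. The sketch below is purely hypothetical (the K-factor is an invented calibration value, and real dispensers close a solenoid valve rather than break out of a loop):

```python
K_FACTOR = 1200.0  # pulses per litre -- hypothetical calibration value

def dispensed_volume(pulse_count: int, k_factor: float = K_FACTOR) -> float:
    """Convert a raw pulse count from the turbine meter into litres."""
    return pulse_count / k_factor

def dispense_batch(target_litres: float, pulse_source, k_factor: float = K_FACTOR) -> float:
    """Count pulses from an iterable pulse source and stop once the target
    volume is reached (a real dispenser would close a valve at this point)."""
    pulses = 0
    for _ in pulse_source:
        pulses += 1
        if pulses / k_factor >= target_litres:
            break
    return dispensed_volume(pulses, k_factor)

# Example: dispense a 1-litre batch from a simulated stream of pulses.
volume = dispense_batch(1.0, iter(range(10_000)))
```

The resolution of such a batch is one pulse, i.e. 1/K-factor litres, which is why a high pulses-per-litre meter suits small retail batches.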

Fluorescent approach aims to combat honey fraud
4 October 2023

A new technology based on fluorescence is being developed to detect if honey has been blended with cheaper additions such as sugar syrup.

The brainchild of Aston University and The Scottish Bee Company, with help and support from the British Beekeepers Association and the Honey Authenticity Network UK, the approach appears to offer a solution to the second most prevalent form of food fraud worldwide (after milk-related fraud).

Earlier this year the EU study “From the Hives” reported that all 10 samples it had collected of honey exported from the UK were flagged as suspected of being adulterated. Although blended or packaged in Britain, the honey might have originated overseas.

Researchers at the University’s Aston Institute for Photonic Technologies (AIPT) and The Scottish Bee Company have been awarded an artificial intelligence feasibility study grant from Innovate UK, part of UK Research and Innovation, to develop honey authenticity testing technology.

Honey samples will be examined using a light-based technique called FLuorescence Excitation-Emission (FLE) spectroscopy combined with machine learning to create a fast and reliable testing method.

Current techniques, such as chromatography, nuclear magnetic resonance and sensory analysis are expensive and time-consuming.

The University team is led by Dr Alex Rozhin, reader in nanotechnology within Aston Institute of Photonic Technologies (AIPT). He said: “The project aims to enhance consumer confidence, elevate product value, and safeguard the reputation of British honey.

“The team will utilise advanced machine learning techniques, specifically parallel factor analysis (PARAFAC) and partial least squares discriminant analysis (PLS-DA), to analyse honey samples.

“These techniques offer the capability to identify the chemical constituents, assess their ratios, and determine quality markers within the samples.

“By combining the rapidity, precision, and optical capabilities of FLE spectroscopy with machine learning algorithms, the project aims to surpass existing state-of-the-art methods for assessing the quality of honey.”
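As a rough illustration of how PLS-DA separates spectra into classes, the toy sketch below fits a minimal single-response PLS model (via the NIPALS algorithm) to synthetic “fluorescence spectra”. Everything here, from the peak positions to the noise level, is invented for illustration; it is not the Aston/AIPT pipeline, which additionally applies PARAFAC to full excitation-emission matrices.

```python
import numpy as np

rng = np.random.default_rng(0)

def synth_spectrum(peak: int, n_points: int = 100) -> np.ndarray:
    """A noisy Gaussian emission peak standing in for a measured spectrum."""
    x = np.arange(n_points)
    return np.exp(-0.5 * ((x - peak) / 8.0) ** 2) + 0.05 * rng.standard_normal(n_points)

# Class +1 = "authentic" (peak near channel 30); class -1 = "adulterated" (peak near 60)
X = np.array([synth_spectrum(30) for _ in range(20)] +
             [synth_spectrum(60) for _ in range(20)])
y = np.array([1.0] * 20 + [-1.0] * 20)

def pls1_fit(X, y, n_components=2):
    """Single-response PLS via NIPALS; returns regression coefficients and means."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # scores
        p = Xc.T @ t / (t @ t)          # X loadings
        q = (yc @ t) / (t @ t)          # y loading
        Xc = Xc - np.outer(t, p)        # deflate
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)  # coefficients in original X space
    return B, X.mean(0), y.mean()

B, x_mean, y_mean = pls1_fit(X, y)
pred = np.sign((X - x_mean) @ B + y_mean)  # discriminant: classify by sign
accuracy = float((pred == y).mean())
```

On clearly separated synthetic classes like these the discriminant recovers the labels; the hard part of the real project is doing this on genuine honey, where adulterant signatures are far subtler.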

Fluorescence (FL) spectroscopy used for rapid food quality assessment
The Scottish Bee Company will be sharing its expertise in honey production to inform the research. Iain Millar, director of The Scottish Bee Company, said: “Honey fraud isn’t just a consumer rights issue, it has far-reaching implications for food security, land use and biodiversity.

“Aston University has a wealth of expertise in the area of food fraud, and we are glad to be working with their team and our colleagues at the BBKA to protect the integrity of the UK honey market.”

The project also involves Lynne Ingram, master beekeeper and founder of Honey Authenticity Network UK (HAN UK). She said: “HAN UK has been campaigning for years to raise awareness of the issue of honey adulteration globally and in the UK.

“We are delighted to be working with Aston University, The Scottish Bee Company, and the British Beekeepers Association on the development of this exciting new test, which will go some way to overcoming the problem of honey fraud in the UK.”

The British Beekeepers Association (BBKA) will lend its expertise and help with the collection of honey samples. Julie Coleman, master beekeeper and chair of the research, technical and environmental committee of the BBKA, said: “Adulterated and fake honey is of great concern to our members and we are pleased to have the opportunity to collaborate with all partners to collect and supply authentic honey for analysis.

“We hope that our samples can be used to prove the methodology and application of this exciting new technology.”

The research will take place at Aston University and will involve team members Dr Raghavan Chinnambedu-Murugesan, Dr Steve Daniels and Dr Valentina Barker.

Director of AIPT Professor Sergei Turitsyn said: “I am delighted for Alex and the team he leads.

“Research into the applications of photonics in food industry and agri-tech in AIPT started from the establishing of The Wolfson Centre for Photonics for Food and Agri-Tech funded by the Wolfson Foundation.

“This area was further supported by the Garfield Weston Foundation. We are extremely grateful to these charities for their support and we will continue to deliver for society, science and UK industry.”

The current project is expected to continue for six months and to translate into a UK-wide project that will create a comprehensive database of UK honey samples and develop portable instruments for honey sampling and detection.

Firm appointed by EA to research PFAS from landfill
5 April 2023

The Environment Agency’s (EA) PFAS Risk Screening Project will undertake assessments which may inform future regulatory measures, and it has appointed professional services firm GHD to help with the associated research.

As part of Phase 4 of the project, GHD will work with the EA to help assess and characterise the risks of per- and polyfluoroalkyl substances (PFAS), persistent organic pollutants (POPs) and other hazardous substances in UK landfill emissions (leachate and gases). The objective of the work is to improve overall understanding of the scale and extent of such substances from landfill emissions and their potential migration pathways and impact on the terrestrial and marine environments.

Through monitoring at select landfills and wastewater treatment plants across England, GHD will test various media for the presence and magnitude of PFAS and POPs contaminants, and then interpret the data for emissions characterisation and look at the effectiveness of leachate treatment plants and associated wastewater treatment works in contaminant removal. The information gathered will be used by the EA to develop regulatory tools to ensure guidance for environmental monitoring at landfill sites is fit for purpose.

GHD will work with its partner Enitial, a specialist monitoring firm with experience at landfill and water treatment sites.

Measuring oil contamination in soil
8 February 2023

Petroleum hydrocarbons in soil continue to be an area of interest for scientists, as they are the most common contaminants toxic to human and environmental receptors. These contaminants include various molecules that are grouped into aliphatic and aromatic hydrocarbons. Generally, initial field-based soil analysis is non-specific and, while useful, the results are normally supplemented with those of more accurate and precise lab-based techniques. Here Paul Vanden Branden, director and product manager at laboratory equipment supplier SciMed, discusses the performance of field-based non-dispersive infra-red spectroscopy (NDIRS) technology for petroleum hydrocarbon determination compared to traditional lab-based methods.

Typically, the analysis of oil-contaminated soils during site investigation and remediation involves a range of non-specific field-based screening techniques and specific lab-based fingerprint techniques completed off-site by commercial laboratories using certified analytical methods.

Typical non-specific field-based techniques include NDIRS, portable gas chromatography coupled with mass spectroscopy (GC-MS), ultra-violet fluorescence spectroscopy (UVFS), visible-near infrared (vis-NIR) spectroscopy, Fourier-transform infrared (FTIR) spectroscopy and photo-ionisation detection (PID). These are used to screen total petroleum hydrocarbons (TPH), quantify aliphatic and aromatic hydrocarbons during site investigation, identify potential hydrocarbon concentration hotspots and compare TPH concentrations in the environment.

Generally, the lab-based fingerprint techniques used are either GC-MS or high-performance liquid chromatography coupled to mass spectrometry (HPLC-MS). These provide in-depth data into aliphatic and aromatic hydrocarbon speciation, qualitative and quantitative hydrocarbon degradation due to weathering or engineered remediation, and have high sensitivity and accuracy for risk indicator compounds, which is required to meet regulatory requirements.

While the lab-based fingerprint techniques offer high accuracy and precision, the procedures involved are often time-consuming and expensive. Therefore, they are not seen as cost-effective methods for decision making needed during site investigation, remediation monitoring and validation.

Over the past decade, various field-based analytical technologies have been developed, expediting hydrocarbon determination on site and increasing the number of soil samples that can be analysed at lower cost. However, more data on their performance and accuracy for different soil types, levels of contamination and fuel types is needed before these can be widely adopted. Furthermore, comparison of their ability to quantify different hydrocarbon groups for risk assessment purposes, and evaluation of whether they can offer a good alternative to lab-based technologies for remediation monitoring and validation, is still limited.

Field-based NDIRS
In analytical chemistry, extraction procedures aim to separate the analyte quickly, quantitatively and using as little solvent as possible. A recent report by Concawe outlined how solvent-based extraction field technologies, including NDIRS, performed well for the detection and quantification of TPH between 100 and 10,000 mg/kg, independent of soil type and fuel type. The NDIRS instrument used in Concawe’s study was the Infracal 2 ATR-SP TPH analyser, which can perform TPH determination of hydrocarbons in five to ten minutes, significantly faster than lab-based GC-MS.

For the experiments covered in Concawe’s report, scientists used a hexane extraction method, adding 1% v/w to soil samples and shaking them for two minutes. This extract was cleaned using activated silica gel and Whatman no. 40 filter paper. For level three spiked soils, the extracts were further diluted five times in accordance with the Infracal 2’s detection range. Before each measurement, the attenuated total reflectance (ATR) crystal was cleaned with 99.9% isopropanol, and the instrument was zeroed every hour. To measure a sample, 60 μL was deposited onto the ATR crystal, and the solvent was given time to evaporate before the measurement was taken.
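As a worked illustration of how such an extract reading converts back to a soil concentration, the sketch below accounts for solvent volume, soil mass and any dilution step. All the numbers are hypothetical, not taken from the Concawe report:

```python
def tph_in_soil(extract_mg_per_l: float,
                solvent_volume_ml: float,
                soil_mass_g: float,
                dilution_factor: float = 1.0) -> float:
    """Back-calculate TPH in mg per kg of soil from the analyser's reading
    of the hydrocarbon concentration in the solvent extract."""
    # mg of hydrocarbon recovered into the (undiluted) extract
    mg_in_extract = extract_mg_per_l * dilution_factor * solvent_volume_ml / 1000.0
    # normalise by soil mass in kg
    return mg_in_extract / (soil_mass_g / 1000.0)

# e.g. a 20 g soil sample extracted with 20 mL hexane, extract diluted 5x,
# analyser reading 50 mg/L in the diluted extract:
tph = tph_in_soil(50.0, 20.0, 20.0, dilution_factor=5.0)
```

The result (250 mg/kg in this invented example) sits comfortably inside the 100 to 10,000 mg/kg range over which the report found the field method performed well.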

The report concluded that the NDIRS field test provides GC-MS comparable TPH recoveries and meets the performance requirements for many regulatory standards. This means scientists can use the Infracal 2, which SciMed supplies, to conduct cost-effective petroleum hydrocarbon analysis on-site, without outsourcing to commercial analytical labs.

To speak to one of SciMed’s team about how the Infracal can help you in your application, fill out an enquiry form on the company’s website.

Sponsored Content: Flow meter accuracy is critical to biofuel production
15 January 2023

Titan Enterprises’ OEM customer Green Fuels has been involved with advancing biofuel technologies – expected to be an important contributor to the global net zero target.

For 13 years, Titan’s Oval Gear (OG) flow meters have been an intrinsic component within Green Fuels’ biodiesel plants converting biowaste into sustainable biofuels. Green Fuels’ award-winning research and pioneering process plants have powered buildings with sustainable fuel from used cooking oil, and most recently developed a process to convert fish oil and sewage sludge to biofuel. Jason Askey-Wood, Green Fuels’ UK MD, said: “Accuracy of measurement in our chemical processes is critical and Titan’s range of oval gear flowmeters provide just that. Simple, accurate and chemically resistant, Titan’s OG flow meters, with the addition of the Namur switch, were the ideal solution for precise batch dosing within a potentially hazardous environment.”

The OG Series flow meters are designed to give reliable, high performance across a wide range of applications. They are almost immune to the effects of varying liquid viscosity, density and temperature, and measurement performance improves as liquid viscosity increases.

Intrinsically safe options provide safe operation in potentially explosive atmospheres. They are constructed from durable materials including stainless steel, Hastelloy and PEEK. A fully-certified NAMUR proximity switch can be installed.

Jeremy Thorne of Titan Enterprises said: “The OG flow measurement devices are ideal for use with aggressive chemicals, combining durable materials, robust design and proven technology to ensure they will have a long product life with reliable, accurate operation throughout.” Jason added: “It’s an exciting time for Green Fuels with carbon reduction and climate change at the top of the world’s agenda. We have seen a shift in our business in developing joint ventures with both advanced economies such as Oman and developing countries, working together to find sustainable fuel solutions for the Transport and Energy sectors as they look to decarbonise and reduce harmful emissions.”

flowmeters.co.uk/oval-gear-flow-meters/

Uncrewed vessel returns from volcano caldera survey in Tonga loaded with ‘astounding’ data
3 August 2022
Aerial view of the Hunga-Tonga Hunga-Ha’apai (HT-HH) volcano, showing new multibeam depth data overlaid on a satellite image of the islands (deep depths in blue, shallow depths in red). Credit: SEA-KIT / NIWA-Nippon Foundation TESMaP survey team.

A plethora of data and imagery obtained using an Uncrewed Surface Vessel (USV) is filling important gaps in scientists’ understanding of the Hunga-Tonga Hunga-Ha’apai (HT-HH) undersea volcano in the South Pacific, the site of an eruption that began in December 2021.

The USV Maxlimer – from UK firm SEA-KIT International – has returned from an initial survey mission inside the caldera of the volcano. The vessel is equipped with a Multibeam Echo Sounder (MBES) to acoustically measure depth and the state of the seabed. Importantly, the vessel also has new winch capability for deployment of multiple sensors down to 300 metres to obtain direct water column measurements.
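The acoustic principle behind an MBES is simple to state: depth is half the two-way travel time of an echo multiplied by the speed of sound in water. A back-of-envelope sketch, using a nominal 1,500 m/s (real systems apply measured sound-velocity profiles and per-beam angle corrections):

```python
SOUND_SPEED_MS = 1500.0  # nominal speed of sound in seawater, m/s

def depth_from_echo(two_way_travel_s: float, c: float = SOUND_SPEED_MS) -> float:
    """Depth below the transducer from a vertical-beam two-way echo time."""
    return c * two_way_travel_s / 2.0

# The survey found the caldera to be ~850 m at its deepest point, which at a
# nominal 1,500 m/s corresponds to a round-trip echo time of about 1.13 s:
t = 2 * 850.0 / SOUND_SPEED_MS
depth = depth_from_echo(t)
```

A multibeam sounder repeats this measurement across a wide fan of beams on every ping, which is how a single caldera transit yields the dense depth grid shown in the image above.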

Data collected using a Conductivity, Temperature, Depth (CTD) instrument, deployed using the winch, is providing important temperature and salinity information as well as dissolved oxygen and turbidity readings. A Miniature Autonomous Plume Recorder (MAPR), designed to detect chemicals in the water that are common in hydrothermal plumes, is recording light backscattering for suspended particle concentrations, oxidation-reduction potential, temperature and pressure from multiple winch dips and tows. The MAPR project is a joint initiative between NOAA in the USA and GNS Science in New Zealand.

On this survey mission, clear signs of continuing volcanic activity were seen inside the crater, with high particle concentrations in the water that are consistent with earlier observations of ash in the water column. The water in the caldera was also found to be homogenous between 150 and 300 metres depth, suggesting strong mixing inside the caldera. The previously observed oxygen minimum was not as distinct.

Sharon Walker, Oceanographer at NOAA’s Pacific Marine Environmental Laboratory said: “Early data shows ongoing activity within the caldera, though it is too early to tell if it is due to continuing eruption but at a reduced intensity, or hydrothermal venting driven by cooling lava, or both.”

The work is part of the second phase of the NIWA/Nippon Foundation Tonga Eruption Seabed Mapping Project – TESMaP, funded by The Nippon Foundation.

Maxlimer is returning to the caldera this week for a more detailed survey to fill gaps and better target volcanic plumes and hydrothermal vents with the CTD and MAPR. Dr Mike Williams, Chief Scientist-Oceans at NIWA said: “The primary objective of phase 2 is to map the caldera and the hydrothermal vents within it and Maxlimer is crucial in achieving that. It is incredibly exciting to be able to look down into the caldera and see volcanic plumes. We now know that at its deepest point it is around 850 metres deep, so more than the height of two and a half Eiffel Towers. The data and imagery that Maxlimer has brought back is astounding and is helping us to see how the volcano has changed since the eruption.”

USV Maxlimer returning from the Hunga-Tonga Hunga-Ha’apai (HT-HH) caldera. Credit: SEA-KIT International.

The 12-metre USV is being remotely controlled on its caldera missions from SEA-KIT’s base in Essex, where a team of four operators works shifts round the clock. A global team of surveyors and scientists based in Australia, Egypt, Ireland, Mauritius, New Zealand, Poland and the USA is collaboratively monitoring and reviewing the data collected.

The UK’s Maritime Minister, Robert Courts, said: “I’m delighted that maritime technologies produced here in the UK are being used to understand the effects of the tragic Tonga volcanic eruption, all whilst being controlled remotely 16,000 kilometres away in the UK. This research will help protect the 680 million people living in coastal areas by providing a better prediction of similar natural disasters in the future and ensure that we are prepared if this does happen again.”

USV Maxlimer is also scheduled to map areas where telecommunications cables were damaged following HT–HH’s violent eruption in January 2022. During Phase 1 of the project, the crew of NIWA’s research vessel, RV Tangaroa, discovered that the severed domestic internet cable was buried under 30 metres of ash and sediment. Project and local stakeholders hope to gain a better understanding of the extent of the damage and how it was caused – likely due to fast-moving pyroclastic flows – from Phase 2. SEA-KIT’s USV may also be deployed to survey alternative sites suitable for replacement cables if needed before returning to the UK.

Ben Simpson, SEA-KIT CEO, said: “Maxlimer was the first SEA-KIT X-class USV to be built. There are now numerous X-class vessels working commercially around the world, but this project clearly demonstrates how crucial this technology is as a low-risk, non-invasive solution to reach, survey and understand places that are challenging or unsafe for people to access. Maxlimer experienced 3 metre seas on return from this first mission, which proves yet again how robust the design is. Demonstrating that this kind of work can be done using less than 2% of the fuel of a typical survey vessel is also a significant step on the industry’s path towards net zero emissions.”

Knowledge gained from the project will be invaluable to the Tongan authorities in preparing and planning for future possible eruptions. The Nippon Foundation’s Executive Director Mitsuyuki Unno said: “Maxlimer’s technology has enabled our Nippon Foundation-GEBCO alumni to not only gain hands-on experience but also to come together as a team. We look forward to sharing the results of this voyage in the coming months and to exploring ways in which the findings can benefit the people of Tonga.”

Lucy Joyce, British High Commissioner to the Kingdom of Tonga, said: “I’m delighted to see SEA-KIT’s Maxlimer Uncrewed Surface Vessel working here in Tonga. The assessment of the Hunga-Tonga Hunga-Ha’apai caldera and the waters and seabed around it is of huge interest and importance to all of us here in Tonga who lived through the huge explosion and tsunami in January.”

Seabed data gathered by USV Maxlimer will also contribute to The Nippon Foundation-GEBCO Seabed 2030 Project, a United Nations Ocean Decade endorsed programme which aims to map the entire ocean floor by 2030. Jamie McMichael-Phillips, Seabed 2030 Project Director, said: “The use of USVs is paramount to Seabed 2030’s mission of a complete map of the entire ocean floor. By surveying areas that may not be accessed by personnel for safety reasons, USVs such as SEA-KIT’s Maxlimer will greatly assist us in our ambitious goal.”

Findings from the TESMaP project will be presented at the American Geophysical Union (AGU) Fall Meeting in Chicago this December, a highly influential event dedicated to the advancement of Earth and space sciences.

Sponsored Content: A technology for the timely warning
9 September 2021

Weather monitoring station

Robin Guy from OTT HydroMet believes that the worst effects of extreme weather can be mitigated by effective monitoring and alarm technologies. “Intense rainfall can be the precursor to flooding, and many of the monitoring networks that we supply include real-time precipitation monitors.”

“However, it is also necessary to monitor groundwater and surface water levels because flooding can be caused by a variety of other issues such as high tides in coastal areas, blocked or poorly designed drains and culverts, and even snow melt.”

Natural Flood Management (NFM) is now regarded as a more effective and sustainable strategy than traditional methods involving engineered flood defence infrastructure. Under NFM flood risk is addressed on a wider catchment scale so that upstream initiatives do not have negative effects further downstream.

This catchment based approach has meant that water managers are increasingly looking for solutions that employ techniques which work with natural hydrological and morphological processes and features to manage flood waters. These NFM techniques include the restoration, enhancement and alteration of natural features such as flood plains, which can help to lower peak flow and decrease flood risk. Both traditional and NFM measures can benefit enormously from the latest monitoring and communications systems.

The latest meteorological and hydrological monitoring networks employ robust sensors that are able to operate remotely on extremely low power levels with long intervals between service or calibration.

As Guy says, these technologies “are able to deliver timely warnings for extreme events; creating windows of opportunity for mitigation measures to protect lives and important assets.”
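As a rough sketch of the kind of alarm logic such a monitoring network might run, the snippet below flags when rainfall over a rolling window exceeds a threshold. This is not OTT firmware; the window length and threshold are invented values purely for illustration:

```python
from collections import deque

def rolling_intensity_alarm(readings_mm, window=6, threshold_mm=30.0):
    """Yield True for each reading where the rolling-window rainfall total
    exceeds the threshold (e.g. six 10-minute totals summing to over 30 mm)."""
    buf = deque(maxlen=window)  # deque drops the oldest reading automatically
    for r in readings_mm:
        buf.append(r)
        yield sum(buf) > threshold_mm

# Alarms are raised as an intense burst of rainfall fills the window:
rain = [0.2, 0.4, 1.0, 8.0, 9.5, 10.0, 7.5, 0.5, 0.2]
alarms = list(rolling_intensity_alarm(rain))
```

Real networks would combine several such triggers (rainfall rate, river stage, rate of rise) before raising a flood warning, precisely to create the "windows of opportunity" described above without generating false alarms.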

ott.com

Sponsored Content: Water managers demand better, faster data
19 February 2021

This article contains paid-for content created in collaboration with OTT Hydrometry Ltd

2020 was an unusual year in many respects, but for OTT Hydromet, one of the more remarkable features of the year was an unprecedented level of orders for water level and flow monitoring equipment. Amid increasing pressure from climate change and urbanisation, OTT’s UK Managing Director Nigel Grimsley explains the growing demand for monitoring technology that enables more effective water management.

Looking back

In 1978 OTT launched the Allgomatic, a large desktop device for ‘data measurement, storage and transmission’ – it wasn’t even called a datalogger. Prior to the Internet and mobile phones, OTT’s customers were monitoring water level in boreholes and stilling wells with dip tapes, or with sensors such as shaft encoders and compressed air bubblers connected to displays, dataloggers or chart recorders. Discharge calculations were based on stage measurements, possibly with a point and hook gauge, combined with manual measurements of velocity with a rotating-element current meter. Rain gauges mostly operated with a collection vessel or a tipping bucket mechanism, recording tips with a connected datalogger. Inevitably, this provided a retrospective view of water resources that was mainly useful for detecting trends and informing models.
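The discharge calculations mentioned above conventionally rely on a stage-discharge rating curve: a power law fitted to paired manual gaugings, which then converts any logged stage reading into a flow. The coefficients in this sketch are hypothetical, purely for illustration:

```python
def discharge_from_stage(h_m: float, a: float = 4.2, h0: float = 0.15, b: float = 1.8) -> float:
    """Discharge (m^3/s) from stage h (m) via a power-law rating curve
    Q = a * (h - h0)**b, with h0 the cease-to-flow stage."""
    if h_m <= h0:
        return 0.0  # below the cease-to-flow stage there is no discharge
    return a * (h_m - h0) ** b

# Converting a series of logged stage readings into flows:
flows = [discharge_from_stage(h) for h in (0.1, 0.5, 1.0, 2.0)]
```

The weakness of this approach, and a reason for the shift to continuous velocity sensing described later in the article, is that the fitted coefficients drift whenever the channel geometry changes.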

Flood protection measures at this time were often designed to increase channel capacity and/or divert water away from sensitive areas. This may have involved engineered flood defence infrastructure with higher flood defences and straighter channels. Flood plains were increasingly used for agricultural, commercial or residential development, and the creation of ever-growing areas of impervious concrete and asphalt increased flood risk at a time when climate change was starting to increase the frequency and severity of severe weather.

Looking forward

Jump forward to today and flood management has changed dramatically – Natural Flood Management (NFM) is now regarded as a more effective and sustainable strategy. Flood risk is now addressed on a wider catchment scale so that upstream initiatives do not have negative effects further downstream. This catchment-based approach has meant that water managers are increasingly looking for solutions that employ techniques which work with natural hydrological and morphological processes and features to manage flood waters. These NFM techniques include the restoration, enhancement and alteration of natural features such as flood plains. However, there are many other ways in which NFM can help to lower peak flow and decrease flood risk. Tree planting, for example, intercepts rainfall and overland flow, increases water infiltration into soil, and provides important connective habitat for wildlife, as well as increased carbon storage.

Tomorrow’s technology

In addition to the provision of baseline data, monitoring systems are also necessary for measuring the effectiveness of mitigation measures and providing timely warnings when alarm conditions arise. In order to better understand catchments, groundwater and surface water levels should be correlated with upper catchment monitoring which includes the measurement of meteorological parameters.

The demand for high-intensity monitoring is growing rapidly, as water managers demand lower levels of uncertainty in their monitoring data and seek to exploit the advantages of networks of continuous monitors. The high volumes of orders that OTT received in 2020 featured many networkable sensors such as the OTT ecoLog 1000, a self-contained surface and groundwater level logger with two-way communication for smartphones and tablets. Remote sensors were also highly popular, including the OTT SVR 100 surface water velocity radar and the OTT RLS non-contact radar level sensor. The volume of dataloggers with satellite communications capability is also increasing as network managers seek to exploit the benefits of reduced data transmission costs.

The demand for better monitoring is driving the development of OTT’s latest hydrometeorological technologies, which can be summarised in four categories:

1. Accurate, reliable, low-power, smart sensors

2. Flexible, intelligent, easy-to-use dataloggers

3. Reliable, appropriate, cost-effective telemetry with multiple transmission options and redundancy capability

4. Customisable, easy-to-use, insightful, web-enabled, data management software

With the benefit of insights from comprehensive real-time monitoring networks, collecting high intensity data from multiple points, water managers will be able to make better informed, defensible decisions.

Sponsored Content: Improving control and yield in gravity thickeners
8 February 2021
Instrumentation firm VEGA Controls looks at performance issues that can arise with an aspect of automation that is a popular fixture in the water industry

Gravity thickener machines are widely installed and used in wastewater treatment plants and water treatment plants, as well as in a wide range of industries with waste or solids separation requirements. These include the food and dairy, pulp and paper, abattoir and special aggregates industries.

The machines are primarily used for water-volume reduction prior to processes like digestion, dewatering/drying, and transportation for incineration or disposal. They are continuously operating machines which thicken sludge by gravity using a revolving porous filter belt or conveyor, and they generally produce a still-pumpable but much-thickened sludge.


A typical gravity belt thickener out-feed. Level controls over the outfeed hopper regulate the pump speed and operation, and the feed into the gravity belt thickener.

They exploit the natural tendency of higher-density solids to settle out of liquid, adding some mechanical assistance to this process of concentration. Think of the example in a kitchen of passing a stock through a sieve, using a spoon to stir and agitate the process to encourage more liquid to pass over a wider area and over cleaner parts of the sieve.

The technique is used in the same way in both industrial and municipal settings. A gravity belt thickener employs ‘gravity drainage’, whereby dilute sludge (typically 0.5% to 1.0%) is introduced at the feed end of a horizontal filter belt. As the slurry moves down the belt, free water drains through the porous belt. The solids are continuously turned, encouraging more water to drain through the belt and the sludge to thicken. A drum thickener works on a similar principle, conveying treated (flocculated) sludge through a slowly rotating drum filter: the sludge remains in the drum while the water phase passes out through the filter.

Some ‘dynamic’ systems also employ a partial vacuum or pressure to assist with the filtering to accelerate or optimise the process. There are also presses and centrifuges which carry out similar tasks, all with similar process challenges and using varying control techniques. In most cases, the final sludge is discharged into an outlet hopper as a pumpable, thickened sludge.

The sludge to be thickened may be polymer-conditioned prior to digestion. Or it could be prepared for a further process of mechanical dewatering in a press or centrifuge. Or it might be water-reduced before it is transported to a sludge disposal site or an area of land where it is to be applied as a fertilizer.

Controlling the thickening process
Control of these mechanised processes is key to producing a consistent throughput and output: run too fast and the filter systems can be overwhelmed; run too slowly and energy and capacity are wasted, which can also have a detrimental effect on the quality of the final sludge.
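As a rough illustration of this balance (a toy sketch, not VEGA's or any vendor's actual control logic, and with invented threshold values), a level reading over the out-feed hopper might regulate pump speed like this:

```python
def pump_speed(level_pct, low=20.0, high=80.0):
    """Toy hopper-level controller: speed up the out-feed pump as the
    hopper fills, stop it when the hopper is nearly empty.
    Thresholds are illustrative, not real plant settings."""
    if level_pct >= high:
        return 100.0   # hopper nearly full: run the pump flat out
    if level_pct <= low:
        return 0.0     # hopper nearly empty: stop, avoid dry-running
    # between the thresholds, scale pump speed linearly with level
    return (level_pct - low) / (high - low) * 100.0
```

In practice such loops use PID control and interlocks rather than a bare linear ramp, but the sketch shows why an accurate, drift-free level reading matters: the pump demand follows the measurement directly.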

To do this, the machines employ a variety of level and pressure sensors – among other devices – to support control of the process and provide feedback on how it is running. For example, in-line pressure sensors and pumps, differential pressure sensors on filters or across pumps, level in-feed sensors on receiving hoppers (and even on the belts themselves), and sensors to measure the density of incoming and outgoing media.

All of these measurements help the machine manufacturers, and in turn the plant operators, optimise performance and control of the process. Really good automated control equals less power consumption, and a better quality output – every gram of water extracted means less energy in transport, further processing or drying. When that gets scaled up to a large wastewater network, or large-scale industrial process in food, paper, steel or aggregates, these grams of water can quickly turn into significant savings across a fleet of machines and sites.
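To put a number on that scaling argument (the figures below are illustrative only, not from the article), the dry solids in the sludge are conserved through thickening, so raising the dry-solids fraction directly shrinks the wet mass that must be hauled or processed further:

```python
def haulage_saved(sludge_tonnes, ds_in_pct, ds_out_pct):
    """Illustrative arithmetic for 'every gram of water extracted':
    thickening from ds_in_pct to ds_out_pct % dry solids reduces the
    wet tonnage downstream. Example figures, not plant data."""
    dry = sludge_tonnes * ds_in_pct / 100.0        # dry solids are conserved
    out_tonnes = dry / (ds_out_pct / 100.0)        # wet mass after thickening
    return sludge_tonnes - out_tonnes

haulage_saved(100.0, 1.0, 5.0)  # 100 t at 1% DS becomes 20 t at 5% DS: 80 t less to move
```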

The issues that come up when attempting to secure reliable control are manifold. If we look at the instrumentation that provides the data, accurate measurements provide benefits that are once again amplified through the cost and processing chain of this continuous, automated process.

Challenges of pressure control
One critical element is the pressure sensors used to monitor performance in delivery systems, pumps, recirculation or out-feed circuits. Pressure-monitoring sensors have to be sensitive, which argues for keeping them away from direct contact with arduous process conditions; yet ideally they need to be flush-mounted in pipelines and vessels to avoid blockages in the sludge they are monitoring.

Both constraints can lead to errors. Recessed, protected sensors are prone to blockages or air pockets. These can be worked around with air or water flushing, but that adds cost and complexity, and possibly unwanted water into the process. Flush-mounting them, on the other hand, exposes the sensitive, thin metal pressure diaphragms to abrasion damage from particles in the sludge. The result? Drift in the accuracy of readings which, when you are trying to extract every gram of water or solids, can negate all that optimisation work.

So care needs to be taken in the mounting and positioning of pressure sensors. Air pockets and blockages in small recesses can play havoc with readings and, in turn, necessitate extra servicing, recalibration and repair, introducing costly downtime and interruptions. The outcome can be overflowing or over-running machines, often attributed to a lack of accurate control, and machines are frequently switched to “run in manual” mode to counter this. However, there is another solution when it comes to making these pressure measurements.

Flush mounting without fear?
The problems of blocking, flushing, cleaning, wear and damage can all be avoided with something as simple as using ceramic pressure sensors that can be fully flush-mounted. A concealed elastomer O-ring can reduce exposure to wear.

This ceramic diaphragm technology is already proven in slurry pipes in both the mining and recycled paper industries, with their respective bombardment of equipment with either sand/grit or ‘paperclips’ – all fatal to metal diaphragm devices. “Flush mounting without fear” using ceramic pressure transmitters also means no mis-reads as a result of blockages or air pockets due to syphon/vacuum effects from flows across recessed pipes or fittings. Another benefit of ceramic cells is a vastly improved resistance to pressure-shock overload and ‘water hammer’, once again providing better long-term accuracy and stability.


Ceramic flush-mounted pressure sensors provide hard-wearing diaphragms that avoid blockages and deliver accurate pressure monitoring.

Some ‘dynamic’ thickener systems, using pressure or vacuum, also require a differential pressure measurement across the filters or pumping systems to detect any drop in performance. This information is invaluable, as again it can be used to directly control speeds of filter mesh or flow rate, to avoid overflows and flooding. This can also be achieved with the use of two ceramic-faced sensors, in a remote electronic differential pressure configuration, which again provides a direct flush-mounted measurement.

Measuring density
Density is another parameter monitored during processing, normally via in-line density sensors. Again, they are vulnerable to build-up and wear. An alternative or back-up option is a flush-mounted or top-mounted ceramic-cell differential pressure system, which can be used to provide a continuous density measurement in vessels at certain process points. These often simple and innovative pressure-measurement solutions can be employed to maintain the correct feed rates and quality of output, and to help maintain process reliability.
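The differential-pressure route to density rests on simple hydrostatics: two sensors a known vertical distance apart read a pressure difference Δp = ρ·g·h, so ρ = Δp / (g·h). A minimal sketch (illustrative values, not vendor calibration data):

```python
G = 9.80665  # standard gravity, m/s^2

def sludge_density(dp_pa, sensor_separation_m):
    """Estimate media density from the hydrostatic pressure difference
    between two flush-mounted sensors a known vertical distance apart:
    dp = rho * g * h, hence rho = dp / (g * h)."""
    return dp_pa / (G * sensor_separation_m)

# Example: roughly 10.3 kPa across a 1 m separation suggests a density
# of about 1050 kg/m^3, plausible for a lightly thickened sludge.
```

Real installations must also compensate for temperature and any gas entrainment, but the underlying conversion is this single division.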

Keeping level control
When it comes to level controls in sludge handling and feed applications, conductivity-based point-level switches and ultrasonic level sensors are the incumbent technologies, and a familiar fixture in the water industry. These sensors are often required to work within a confined operational space in close proximity to splashing, sticky sludge. This presents challenges for ultrasonic sensors, such as echo loss and near-range ‘blocking distances’ or ‘dead zones’. Similarly, conductivity-based point-level switch probes can suffer locked-up switch signals.

The result can be machine reliability issues, nearly always caused by build-up and sensor fouling, which can lead to automated processes malfunctioning. A residue build-up on a conductivity probe will often lead to a device remaining in a switched state until someone is called out to site and the probes are cleaned.

Maintenance call-outs for this kind of ‘nuisance false high level’ are costly and a strain on maintenance teams. The machines can overflow or flood, resulting in production downtime, time-consuming and unpleasant clean-ups, and even damage to equipment. When sensors are working at the limits of their capabilities, attempted ‘work-arounds’ often fail and the upshot is a loss of confidence in automated control systems. The thickeners are then run in manual or semi-manual mode, and what should be a virtually continuous, automated process becomes a batch operation, with personnel devoting time to supervising the thickeners that could be spent elsewhere on the plant.

Radar level sensors are both affordable and extremely tolerant of residue build-up on the sensor face – an issue that compromises the performance of other technologies.

Radar-based sensors overcome many of these issues in level monitoring and control. They can work in more confined spaces, without ‘dead zone’ or ‘blocking distance’, and can operate comfortably with build-up and water deposits on the sensor face – with no loss in the accuracy of readings and the reliability of control.

In conjunction with a controller they can also be used for ‘non-contact point-level control’, replacing probes. Until recently they were perceived as expensive, but new compact 80 GHz devices are highly affordable and competitive with ultrasonic sensors and even point-level switches.

Extra benefits such as Bluetooth (for remote adjustment and transfer of operational information) mean they can be monitored in real time, from a safe position, to ensure they are delivering the performance needed.

If a separate, discrete back-up point-level device is needed, another alternative is an admittance-based level-probe solution. This adds expense compared to a conductivity switch, but can deliver much better reliability with its self-adjusting technology that avoids the ill effects of residue build-up.

Admittance-based probes provide a resilient solution and self-adjustment to compensate for residue build-up.

Every application is different, but many of the challenges are similar. Whether you are a machine or system supplier, or an end user with existing equipment, it may be worth consulting a good level- and pressure-instrumentation supplier. Be prepared to collaborate and innovate with them, and even to run trials and tests at problem sites. You could be surprised at the cost-effective improvements to yield, quality and efficiency.

Study proves bits of DNA in seawater correlate to the weight of netted fish, say researchers https://envirotecmagazine.com/2020/12/03/study-proves-bits-of-dna-in-seawater-correlate-to-the-weight-of-netted-fish-say-researchers/ Thu, 03 Dec 2020 17:00:30 +0000 https://envirotecmagazine.com/?p=256074 New tool will help census oceans from surface to seafloor, monitor fish, track shifting marine life due to climate change, around coral reefs, aquaculture or wind farms, oil rigs, and more


Humanity is seemingly a step closer to answering one of the most ancient of questions – “how many fish in the sea?” – thanks to research that appears to show that the amount of fish DNA collected in a water sample closely corresponds to kilos of fish captured in a trawl with nets.

The new study reports that floating bits of DNA found in small water samples reveal the relative biomass of fish in the sea roughly as well as a “gold standard” US state government trawl with nets.

The researchers drew seawater samples during New Jersey government fish trawls and tested the water for fish DNA. Analysis of the water revealed the relative abundance of fish, with a 70% match in results between the two sampling methods. In addition to this strong concordance, the study found that each sampling method yielded information missed by the other.

While environmental DNA (“eDNA”) has been proven before as a reliable way to determine the variety of fish in an area of water, the new study is the first to show that bits of eDNA floating in seawater also disclose the relative abundance of the species swimming through it.

Published by the prestigious ICES Journal of Marine Science, the paper establishes “fishing for DNA” as an inexpensive, harmless complement to nets, acoustics and other established ways to monitor the health of fish stocks and/or the shifting diversity, distribution and abundance of aquatic life.

The paper, a collaboration between The Rockefeller University, Monmouth University, and the New Jersey Bureau of Marine Fisheries, says the information about the diversity and relative abundance of fish available in a one-litre sample is comparable to a 66 million litre trawl sweep, enough seawater to fill a football stadium to the top of the goalposts.

During four voyages by the New Jersey Ocean Trawl Survey in 2019 aboard the research vessel “Sea Wolf,” scientists led by Dr. Mark Stoeckle, Senior Research Associate at The Rockefeller University Program for the Human Environment, drew one-litre pop-bottle sized water samples from various depths just before the trawler’s nets were lowered.

Monmouth University student Skyler Post stores water and eDNA samples.

Implications for fisheries conservation

The finding has profound implications for improving global fisheries management and has led to early proposals for a “Great American Fish Count” in rivers and coastal waters, aided by millions of citizen scientists, comparable to Audubon’s Great Backyard Bird Counts.

Fish and other organisms shed DNA like dandruff, Dr. Stoeckle said, leaving an invisible trail wherever they swim. This environmental DNA can be skin cells, droppings, urine, eggs, and other biological residues that last in the ocean for a few days.

The eDNA process is described as straightforward and extremely inexpensive compared with traditional marine life monitoring methods, which involve ships with large crews and hand counts.

Co-author Zachary Charlop-Powers at The Rockefeller University, lead developer of the software used in the DNA analyses, said that eDNA testing involves collecting and filtering a water sample, extracting and sequencing the DNA in a laboratory, then matching the results found in an online DNA reference library.

“The bioinformatic tools used by the team are the same ‘barcode’ analysis pipelines commonly used by microbiologists but were adapted for the study of marine vertebrates.”
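The end point of such a ‘barcode’ pipeline is a table of read counts per species. The study's actual software is not reproduced here; the toy sketch below (invented sequences and names, exact-match lookup only, whereas real pipelines also denoise, cluster and tolerate mismatches) shows the final tallying step in miniature:

```python
def tally_reads(reads, reference):
    """Assign each sequenced eDNA read to a species via a reference
    library and count reads per species. Reads with no library match
    are discarded. Purely illustrative, not the study's pipeline."""
    counts = {}
    for read in reads:
        species = reference.get(read)      # None if not in the library
        if species is not None:
            counts[species] = counts.get(species, 0) + 1
    return counts

ref = {"ACGT": "Anchovy", "TTGA": "Sea bass"}
tally_reads(["ACGT", "ACGT", "TTGA", "GGGG"], ref)
# {'Anchovy': 2, 'Sea bass': 1}  (the unmatched read is dropped)
```

This also illustrates why a growing DNA reference library matters: reads absent from the library simply cannot be counted.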

He added that the year of sampling and DNA extraction required an investment of just $12,000, exclusive of salaries.

“The applications of environmental DNA in the marine realm are vast,” said Dr Stoeckle, a Harvard-educated MD who helped develop DNA “barcoding,” the identification of species from a small region of the animal’s DNA sequence.

“eDNA offers a low-cost way to monitor the effectiveness of a marine protected area, for example, or whether efforts to restore a coral reef are succeeding. It could reveal the ecological effects of marine industrial activities, including offshore wind farms, oil and gas rigs, and commercial and recreational fishing.”

Adds Dr. Stoeckle: “To put this in perspective, if we thought of a trawl as a full medical CAT or MRI scan, then eDNA can be thought of as a pocket ultrasound: it can be carried and used anywhere in the hospital, without the time and expense of scheduling a full-scale exam. And eDNA surveys will become better and more informative every year as the technique improves and the DNA reference library grows.”

Says co-author Dr. Jason Adolf, Endowed Associate Professor of Marine Science, Monmouth University: “eDNA could also be used to identify life in ocean regions hard to access with trawls, such as very rocky areas, or places too deep or too shallow.”

Monmouth co-author Dr. Keith J. Dunton, an expert on endangered fish species, notes that the results are promising for rare as well as common fish species.

“eDNA along with other technologies like acoustic telemetry offers a sensitive, non-extractive way to monitor declines and revivals of rare, threatened, and endangered species,” he says. “We do not have to put them through stressful capturing to know that they are there.”

Trawl surveys, the main tool used to monitor fish populations, have carefully established protocols and yield rich information but are costly, time-consuming, and require special equipment and fish identification experts. Due to the crew size needed, such trawls have been limited recently by COVID-19.

The New Jersey surveys every season involve deploying a bottom trawl, similar to that used in commercial fishing, behind a vessel over a predetermined pattern. The catches in the nets are hauled up and sorted on tables where the weight of each identified species is recorded. Between 30 and 40 trawls are done about every three months.

To compare the trawl survey to the eDNA survey, one-litre water samples were collected at the surface and at depth before the trawls were done. However, samples were only taken before every fourth trawl. When the data from the two surveys were analyzed, the eDNA survey found most of the same fish species, and also found species not captured in the trawl. And it did so with only one-quarter of the samples taken and a fraction of the effort involved.

The paper says most (70% to 87%) species detected by trawl in a given month were also detected by eDNA, and vice versa, including nearly all (92% to 100%) abundant species. Conversely, most dropouts were relatively rare taxa.

In other comparisons, monthly eDNA species “reads” correlated with the monthly weight, or biomass, of that species recovered in the trawl.
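The strength of such a relationship is typically summarised with a correlation coefficient. The sketch below computes a Pearson correlation with invented per-species numbers (the article does not publish the underlying data, and the paper's exact statistical method is not specified here):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient, no external libraries.
    Used here to compare hypothetical monthly eDNA read counts
    against hypothetical trawl biomass for the same species."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented per-species data: eDNA reads vs trawl biomass (kg)
reads = [1200, 300, 4500, 800]
biomass_kg = [35, 10, 110, 22]
r = pearson(reads, biomass_kg)  # a value near 1 indicates strong agreement
```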

The eDNA reporting “largely concorded with monthly trawl estimates of marine fish species richness, composition, seasonality, and relative abundance,” the paper says.

“It’s important to understand that the results of both methods are true, and complementary,” said Stoeckle. “They catch a lot of overlapping, concordant information as well as some information unique to each method.”

Gregory Hinks of the New Jersey Department of Environmental Protection, who co-authored the paper with Bureau of Marine Fisheries colleague Stacy M. VanMorter, adds: “During times like COVID when it is unsafe to conduct surveys with large crews, the eDNA method might allow us still to maintain some continuity in our surveys. In any case, piggybacking eDNA onto an existing survey may eventually provide an affordable way to improve marine fish stock assessment.”

Future research

The new paper lays out further research required, such as better calibration of eDNA “reads” to fish body mass (how much DNA is shed by 1,000 anchovies weighing one kilo, for example, compared with a single one-kilo sea bass?) and how to account for eDNA reads that may result from injury during a predator attack.

Since collecting water for eDNA is so quick and easy to do, research or oceanographic vessels and commercial and recreational vessels can collect samples as they travel from place to place. Even drones could be deployed to collect water samples.

And with the benefit of additional studies in marine and fresh waters, estimates of animal numbers using eDNA will continue to improve, as will the DNA reference data banks that allow reliable identification of aquatic species.

eDNA opens the way to surveys of unprecedented value, quality, and affordability, says Jesse Ausubel, Director of The Rockefeller University’s Program for the Human Environment, who developed and helped oversee the first international Census of Marine Life, a decadal (2000-2010) collaboration of about 2,700 scientists in 80 countries.

“eDNA makes the ocean a sea of biological information,” he says. “In the USA we could organize a Great American Fish Count in which millions of citizen scientists might collect water for eDNA testing spanning all our waters. Globally, the incipient UN Decade of the Oceans could include a Great Global Fish Count sampling from sea floor to sea surface and near shore to mid-ocean all during a single day or week.”

Tony MacDonald, Director of the Monmouth University Urban Coast Institute, said: “Our institute and scientists were excited to support this innovative work, one of several partnerships in recent years between UCI and The Rockefeller University Program for the Human Environment.”

“We hope to have the opportunity to continue and expand our collaboration with New Jersey’s Department of Environmental Protection Marine Fisheries and the National Oceanic and Atmospheric Administration on future fish trawls to further advance eDNA research.”
