Sensors - Science topic
Questions related to Sensors
Dear all,
Today I started my Velp Respirosoft system for the first time and everything is new to me. I want to analyze pig manure for BMP. I am wondering whether everything is set up correctly, because I am a bit worried about high pressure in the bottles, especially when leaving the system unattended overnight. How does the system work? I know that KOH (or NaOH) neutralizes the CO2, but what happens with the methane? This is a closed (anaerobic) system, so there is no way for methane to leave it and it accumulates in the bottle...
Please, if anyone works with this system, help :)
Just wondering if anyone has any experience in setting up a Getinge mini bioreactor? More specifically, autoclaving the reactor, the gel pH sensor, and the dO2 sensor.
I am having trouble understanding how to use the pH probe and "pressurize" it in the autoclave before first use. The pH probe has a lifecycle of 10-15 autoclave runs. Would I have to bathe the probe in ethanol to sterilize it between cell cultures?
Any advice is welcome! Thanks so much in advance!
What is the effect of refractive index changes on the output wavelength of all-optical sensors?
Specifically, advanced technologies such as sensors and the Internet of Things.
Dear Sir/Madam,
I am not the author of this article: "Conference paper: Torque sensor based electrically assisted hybrid rickshaw-van with PV assistance and solar battery charging station". Could you please delete it from my page?
Thank you.
2024 4th International Conference on Image Processing and Intelligent Control (IPIC 2024) will be held from May 10 to 12, 2024 in Kuala Lumpur, Malaysia.
Conference Website: https://ais.cn/u/ZBn2Yr
---Call For Papers---
The topics of interest for submission include, but are not limited to:
◕ Image Processing
- Image Enhancement and Recovery
- Target Detection and Tracking
- Image Segmentation and Labeling
- Feature Extraction and Image Recognition
- Image Compression and Coding
......
◕ Intelligent Control
- Sensors in Intelligent Photovoltaic Systems
- Sensors and Laser Control Technology
- Optical Imaging and Image Processing in Intelligent Control
- Applications of Fiber-Optic Sensing Technology in Intelligent Photoelectric Systems
......
All accepted papers will be published in conference proceedings, and submitted to EI Compendex, Inspec and Scopus for indexing.
Important Dates:
Full Paper Submission Date: April 19, 2024
Registration Deadline: May 3, 2024
Final Paper Submission Date: May 3, 2024
Conference Dates: May 10-12, 2024
For More Details please visit:
Invitation code: AISCONF
*Using the invitation code in the submission/registration system gets you priority review and feedback.
2024 5th International Conference on Mechatronics Technology and Intelligent Manufacturing (ICMTIM 2024) will be held in Nanjing, China on April 26-28, 2024.
ICMTIM 2024 is held annually, aiming to bring together scholars, experts, researchers, and technicians in the fields of mechatronics and intelligent manufacturing on one academic exchange platform, where they can share research results and cutting-edge technologies, follow academic development trends, broaden research ideas, and strengthen academic research and discussion.
---Call For Papers---
The topics of interest for submission include, but are not limited to:
TRACK 1: Mechatronics Technology
· Mechatronics Control
· Sensors and Actuators
· 3D Printing Technologies
· Intelligent control
· Motion Control
......
TRACK 2: Intelligent Manufacturing
· Modeling and Design
· Intelligent Systems
· Intelligent mechatronics
· Micro-Machining Technology
· Sustainable Production
......
All accepted papers, both invited and contributed, will be published and submitted for inclusion in IEEE Xplore, subject to meeting IEEE Xplore's scope and quality requirements, and will also be submitted to EI Compendex and Scopus for indexing. Conference proceedings papers must be at least 4 pages long.
Important Dates:
Full Paper Submission Date: February 10, 2024
Registration Deadline: March 10, 2024
Final Paper Submission Date: March 25, 2024
Conference Dates: April 26-28, 2024
For More Details please visit:
2024 5th International Conference on Artificial Intelligence and Electromechanical Automation (AIEA 2024) will be held in Shenzhen, China, from June 14 to 16, 2024.
---Call For Papers---
The topics of interest for submission include, but are not limited to:
(1) Artificial Intelligence
- Intelligent Control
- Machine learning
- Modeling and identification
......
(2) Sensor
- Sensor/Actuator Systems
- Wireless Sensors and Sensor Networks
- Intelligent Sensor and Soft Sensor
......
(3) Control Theory And Application
- Control System Modeling
- Intelligent Optimization Algorithm and Application
- Man-Machine Interactions
......
(4) Material science and Technology in Manufacturing
- Artificial Material
- Forming and Joining
- Novel Material Fabrication
......
(5) Mechanic Manufacturing System and Automation
- Manufacturing Process Simulation
- CIMS and Manufacturing System
- Mechanical and Liquid Flow Dynamic
......
All accepted papers will be published in the Conference Proceedings, which will be submitted for indexing to EI Compendex and Scopus.
Important Dates:
Full Paper Submission Date: April 1, 2024
Registration Deadline: May 31, 2024
Final Paper Submission Date: May 14, 2024
Conference Dates: June 14-16, 2024
For More Details please visit:
Invitation code: AISCONF
*Using the invitation code in the submission/registration system gets you priority review and feedback.
Hello everyone.
I'm working on a vacuum drying machine, and I lack the knowledge to put the sensors in their right places. I could place them by experiment, but I need something to rely on, such as standards or regulations. If anyone is an expert or able to help, please reach out to me; I would really appreciate it.
this is my email: [email protected]
2024 3rd International Conference on Aerospace, Aerodynamics and Mechatronics Engineering (AAME 2024) will be held in Nanjing, China from April 12 to 14, 2024.
AAME is an annual conference providing a yearly platform for delegates and members to present and discuss the latest research, with many opportunities to engage in dialogue about materials science and intelligent manufacturing. It also provides new insights and brings together scholars, scientists, engineers, and students from universities and industry all over the world under one roof.
We warmly invite you to participate in AAME 2024 and look forward to seeing you in AAME 2024!
---Call For Papers---
The topics of interest for submission include, but are not limited to:
- Rocket Theory and Design
- Avionics Engineering
- Communication Systems and Technologies
- New applications
- Higher frequencies and bandwidths
- Navigation and Precise Positioning
- UAV and MAV
- Aircraft navigation and positioning technology
- Radar detection and imaging technology
- Aviation navigation systems and new technologies
- Synthetic aperture radar technology
- Navigation guidance and control
- Analog and digital circuits
- Microelectronics Manufacturing Engineering
- Signal Processing
- Circuits and Systems
- Vacuum electronic technology
- Automatic Control Systems
- Sensors and Sensor Systems
- Aerospace Science and Technology
- Mechatronics Systems
- Electrical and electronic technology
- Microelectronic Technology
- Circuit Analysis
All accepted full papers will be published in the conference proceedings by Journal of Physics: Conference Series (JPCS) (ISSN:1742-6596) and will be submitted to EI Compendex / Scopus for indexing.
Important Dates:
Registration Deadline: April 10, 2024
Final Paper Submission Date: April 08, 2024
Conference Dates: April 12-14, 2024
For More Details please visit:
Dear smart friends, I have a question about sonification. Can you advise me how I can convert human movement recorded by inertial sensors into sound? Is there any available program/application to download or purchase? Thank you very much.
I have not found any anti-iGluSnFR antibodies available, so I am curious whether standard anti-GFP antibodies could recognize some epitope in the circularly permuted form of GFP in the glutamate sensor iGluSnFR... Does anyone have experience with this?
I want it to filter the fetal heartbeat sound obtained from the piezoelectric sensor.
What materials can be used in photonic crystal fiber sensors to withstand and measure high temperatures? (more than 100 degree centigrade)
At present, some researchers use machine learning to achieve a one-to-one mapping between spectral information and response, attaining both high sensitivity and a wide measurement range. Does this method have drawbacks? How likely is it to work in practice?
I need your help to explain this issue: why, during the recovery process, can the sensor not return to its initial value?
I want to know the specifications of power sources like thin-film batteries or supercapacitors in self-powered sensors.
I am working on thin-film batteries and their potential application in self-powered sensors. That's why I want to know how much capacity, voltage, energy density, and power density are required for a self-powered sensor's power source, and what its dimensional limitations are.
Can anyone please help me discover the details of such a power source? It will be beneficial to my current research or in my PhD work.
I have encountered an error while measuring the light intensity of my laser source (650 nm) (see the attached image). The serial plot remains constant even when I change the intensity of the light source; I have even tried both extremes, a dark environment and placement close to the laser source, yet there are no changes in the serial plot. Has anyone encountered a similar problem? How do I solve this error?
Here is the code used for the complete setup of the BH1750 light sensor and Arduino Nano:
/*
  Advanced BH1750 library usage example

  This example has some comments about advanced usage features.

  Connection:
    VCC -> 3V3 or 5V
    GND -> GND
    SCL -> SCL (A5 on Arduino Uno, Leonardo, etc., or 21 on Mega and Due; freely selectable on ESP8266)
    SDA -> SDA (A4 on Arduino Uno, Leonardo, etc., or 20 on Mega and Due; freely selectable on ESP8266)
    ADD -> (not connected) or GND

  The ADD pin is used to set the sensor I2C address. If it is at a voltage greater
  than or equal to 0.7 * VCC (e.g. you have connected it to VCC), the sensor
  address will be 0x5C. Otherwise (ADD voltage below 0.7 * VCC) the address is
  0x23 (the default).
*/

#include <Wire.h>
#include <BH1750.h>

/*
  The BH1750 can be physically configured to use one of two I2C addresses:
    - 0x23 (most common; ADD pin < 0.7 * VCC)
    - 0x5C (ADD pin > 0.7 * VCC)
  The library uses address 0x23 by default, but you can pass any other address.
  If you have trouble with the default value, try changing it to 0x5C.
*/
BH1750 lightMeter(0x23);

void setup() {
  Serial.begin(9600);

  // Initialize the I2C bus (the BH1750 library doesn't do this automatically).
  Wire.begin();
  // On ESP8266 you can select the SCL and SDA pins using Wire.begin(D4, D3);

  /*
    The BH1750 has six different measurement modes, divided into two groups:
    continuous and one-time measurements. In continuous mode, the sensor
    continuously measures the light level. In one-time mode, the sensor makes
    a single measurement and then goes into power-down mode.

    Each mode has three different resolutions:
      - Low Resolution Mode    (4 lx precision, 16 ms measurement time)
      - High Resolution Mode   (1 lx precision, 120 ms measurement time)
      - High Resolution Mode 2 (0.5 lx precision, 120 ms measurement time)

    By default the library uses Continuous High Resolution Mode, but you can
    set any other mode by passing it to BH1750.begin() or BH1750.configure().

    [!] Remember: in one-time mode the sensor goes to power-down mode each
    time it completes a measurement and you have read it.

    Full mode list:
      BH1750_CONTINUOUS_LOW_RES_MODE
      BH1750_CONTINUOUS_HIGH_RES_MODE (default)
      BH1750_CONTINUOUS_HIGH_RES_MODE_2
      BH1750_ONE_TIME_LOW_RES_MODE
      BH1750_ONE_TIME_HIGH_RES_MODE
      BH1750_ONE_TIME_HIGH_RES_MODE_2
  */

  // begin() returns a boolean that can be used to detect setup problems.
  if (lightMeter.begin(BH1750::CONTINUOUS_HIGH_RES_MODE)) {
    Serial.println(F("BH1750 Advanced begin"));
  }
  else {
    Serial.println(F("Error initialising BH1750"));
  }
}

void loop() {
  float lux = lightMeter.readLightLevel();
  Serial.print("Light: ");
  Serial.print(lux);
  Serial.println(" lx");
  delay(1000);
}
Why is spiking of an analyte performed to validate a sensor's performance, rather than using an HPLC or ICP method to measure the concentration of the targeted analyte?
Nowadays, multiple areas of engineering use ultrasonic sensors, mainly due to their high precision within short distances and their robustness to electromagnetic interference. The well-known HC-SR04 ultrasonic sensor generates ultrasonic waves at a 40 kHz frequency. While there is no shortage of information about its working principle, applications, and limitations, little is known about the energy density (i.e., intensity) of the ultrasonic pulses it generates. Could anyone provide me with such information? I would very much appreciate it!
Output power changes on the optical fiber sensor for hydrogen gas have been illustrated.
- Why does the Pd-Cu sensor show no sensitivity at low hydrogen percentages while the Pd sensor does?
- Why does the slope of sensitivity increase, especially in the 4 to 6% hydrogen range, more for the Pd-Cu sensor compared to Pd?
- Why does the Pd-Cu sensor exhibit non-linear behavior, especially in the 7 and 8% hydrogen percentages, while the Pd sensor is more linear?
Best regards,
What is the relationship between drain current and sensitivity in the FET-based sensor?
I don't understand the statement that sensitivity increases when the drain current increases.
I'd like to ask for your advice.
I want to make a correlation of AOD 550nm (MODIS MCD19A2, 1km) with PM2.5 (non-referenced sensors: low-cost sensors). For this, I have downloaded MODIS MCD19A2 files and selected SDS optical_depth_055 during post-processing on LAADS Web.
- How can I collocate the MODIS AOD data at the minimum pixel level (probably a 3×3 pixel window) so that it can be correlated with the point sensor? I probably need a Python script or an ENVI manual for this.
- How can this MCD19A2 AOD 550 nm data be pre-processed for cloud masking and QA-filtered for good-quality data?
- Mostly, the MAPPS website (http://giovanni.gsfc.nasa.gov/aerostat/) is used for the validation of MODIS (Terra: MOD & Aqua: MYD) products against AERONET. However, there is no option for MCD (Aqua & Terra combined). How can I validate MCD19A2 AOD 550 nm against AERONET AOD?
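A minimal sketch of the 3×3 collocation step, assuming the AOD_550 grid has already been read into a NumPy array (e.g. via pyhdf or GDAL) and a boolean QA mask has been derived from the QA SDS; all names here are illustrative, not part of any MODIS toolkit:

```python
import numpy as np

def collocate_3x3(aod, row, col, qa_mask=None, min_valid=5):
    """Average a 3x3 pixel window of AOD centred on (row, col).

    aod      : 2-D array of AOD_550 values (NaN = fill value / cloud)
    qa_mask  : optional boolean array, True where QA flags mark good data
    Returns NaN unless at least `min_valid` good pixels exist in the window
    (a common robustness criterion in satellite-vs-ground validation).
    """
    r0, c0 = max(row - 1, 0), max(col - 1, 0)
    window = aod[r0:row + 2, c0:col + 2].astype(float).copy()
    if qa_mask is not None:
        window[~qa_mask[r0:row + 2, c0:col + 2]] = np.nan
    valid = np.isfinite(window)
    if valid.sum() < min_valid:
        return float("nan")
    return float(np.nanmean(window))
```

The (row, col) index would come from mapping the ground sensor's lat/lon into the MCD19A2 sinusoidal grid; the same window averaging is what MAPSS-style protocols do around an AERONET site.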
Biosensors are developing day by day thanks to their applications, and each of these sensors is used in specific fields. However, I would like to know more about the next generation of these devices. Apart from the review articles and the work that has already been done, I would like to know what you think about the future of these devices. For example, what features will they have? Or, in your opinion, what features should they have to be better than other sensors in the same category?
Please write any ideas or comments you have about this. Thank you.
I am searching for and trying to develop an interesting project topic for my MSc. thesis that relates to data acquisition, sensors, or anything relating to robust or predictive control.
If we place an SPR sensor device between the poles of an electromagnet and vary the intensity of the electric and magnetic fields, how does the sensor behave, especially when the external electric field and the evanescent field resonate?
Explanation:
Only 57% of the expenses for urban trains and 40% for suburban trains have been recuperated, a fact that regular train travelers might have observed through ticket prices.
Despite these challenges, Indian Railways has achieved a remarkable milestone in the 2022-23 fiscal year, reporting a record revenue of ₹2.40 lakh crore. This represents a substantial increase of almost ₹49,000 crore compared to the previous year, as highlighted in a ministry statement on Monday. Notably, the freight revenue experienced a significant uptick, reaching ₹1.62 lakh crore, marking a robust growth of nearly 15% from the preceding year, as reported by NDTV on April 18, 2023.
Hello ResearchGate Community,
I am currently searching for a methane sensor with specific capabilities and would greatly appreciate any recommendations or insights from this knowledgeable community.
Requirements:
Application Context: We are working with Natural Gas-Fired Reciprocating Compressors and need to measure methane concentration in line with the exhaust pipe.
Measurement Range: The sensor must be capable of measuring methane concentrations up to 15,000 ppm at 191 Celsius (~375 F). Pressure is less than 20 psi.
Accuracy: +/- 5%
Output: 4-20 mA (preferred) or CANOpen
Hazardous Location: C1D1 or C1D2
Any insights into the operational challenges of high-temperature methane sensors would be greatly appreciated.
Thank you in advance for your time and assistance!
Best regards,
Carlos Pena
GRA - Phd Candidate
+1-940-595-0047
University of Oklahoma
Sustainable Energy & Carbon Management Research Center
Norman, OK - USA
Does it make sense to evaluate the "sensor delay" when we have simulated a PCF temperature sensor in 2-D (in Comsol software)?
Is there a certain formula to evaluate it?
I am trying to explore data collection methods and protocols for transferring sensor data from the IoT devices to a local server.
We usually measure current with a shunt or a Hall sensor. Compared to Hall sensors, measuring current with a shunt preserves good dynamics. However, the shunt has an equivalent resistance, and for weak-current (µA to mA) measurements this resistance is usually made high, e.g., 100 Ω, to get enough SNR, which may lead to a very long settling time for the current during the measurement. So how can we measure a weak current with ultra-low input impedance? Is there a reference circuit to realize this?
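The question does not name a circuit, so treat the following as an assumption: the textbook answer is a transimpedance amplifier (TIA), whose feedback holds the input node at a virtual ground, so the effective input impedance is roughly the feedback resistor divided by the op-amp's open-loop gain while the output still develops R_f volts per ampere. A back-of-the-envelope comparison with purely illustrative numbers:

```python
import math

# Hypothetical values, chosen only to illustrate the scaling.
R_SHUNT = 100e3   # 100 kohm shunt sized for uA-level SNR
C_SOURCE = 1e-6   # 1 uF of source/cable capacitance at the measurement node
R_F = 100e3       # TIA feedback resistor (same volts-per-ampere gain as the shunt)
A_OL = 1e5        # op-amp open-loop gain

def settle_time(r, c, tol=1e-3):
    """Time for a single-pole RC node to settle within `tol` of its final value."""
    return r * c * math.log(1.0 / tol)

# Feedback divides the apparent input impedance by the loop gain.
z_in_tia = R_F / (1.0 + A_OL)

t_shunt = settle_time(R_SHUNT, C_SOURCE)  # node must charge through the full shunt
t_tia = settle_time(z_in_tia, C_SOURCE)   # node only sees the tiny virtual-ground impedance
```

With these placeholder numbers the TIA settles orders of magnitude faster at the same gain; a real design must also consider op-amp bias current, current noise, and stability (typically a small capacitor across R_f).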
Description: I am working on a project with several IoT devices, such as IP cameras and sensors, and I want to test its performance over a network. For this I am using the NetSim simulator with the emulation module. Can you let me know how to configure this? Thank you.
I did a piezoelectric test with a digital oscilloscope, and when I hit the sensor with my hand only once, I saw the output rise and fall several times instead of just once. Immediately after the impact the output was stable, but during the impact it looked as if there were several impacts. Does anyone know why multiple outputs are recorded, and what the solution is?
The image on the left of each peak shows the increase in output during each hit. In the enlarged image on the right you can see that the voltage reaches its maximum and minimum about 10-12 times per stroke.
Hello all, I am looking to simulate a heterogeneous Wireless Sensor Network (WSN) with energy harvesting devices for my research. The network should consist of at least two classes of wireless sensors: a set of low-energy normal sensors and another set of overlay sensors, which could be high-energy or preferably energy harvesting sensors. The data collected by these sensors should be transmitted to an external fusion node. My goal is to vary properties such as clustering and study network performance and lifetime. Currently, I am using the NetSim simulator, but I welcome directions on using any other simulator as well.
What are the sensors for crop health monitoring and crop monitoring and recommendation system using machine learning techniques?
Under what circumstances/applications does one use a laser vibrometer (which works on the principle of the Doppler effect), the laser triangulation method, or a laser confocal sensor? How does one determine which one is best for a specific application? Also, what is the difference when considering the time needed to take one vibration measurement?
By experimental here I mean a purely laboratory experimental trial, e.g., a sensor is designed in a laboratory and its functionality is verified using an artificial sample. What risk-of-bias tool can be used for such a study?
Hi
I'm trying to find a solution for a problem related to satellite temperature data.
I work with intertidal environments, and I have had temperature sensors deployed in these environments for years. The data show that the temperatures observed by these in situ sensors are quite different from the satellite values.
The problem is that I intend to build mechanistic species distribution models, but my mechanistic data were based on data collected by the in situ sensors, which exist in only a few places, whereas the satellite data are everywhere.
Assuming my in situ sensors are the 'correct' ones, is there a way to calibrate the satellite data for the places where I have no sensors, based on the differences observed between satellite and in situ sensors at the places where I do have sensors?
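One common approach, sketched here under the assumption that the satellite-minus-logger bias is spatially stable: fit a linear correction at the co-located sites and apply it everywhere else (the numbers below are synthetic placeholders, not real data). More defensible variants regress the residuals on covariates such as shore height or exposure, or use spatiotemporal kriging:

```python
import numpy as np

# Paired observations at sites where both records exist (synthetic placeholders).
sat = np.array([14.0, 16.5, 18.0, 20.2, 22.5])      # satellite temperatures
insitu = np.array([15.1, 17.4, 19.0, 21.3, 23.4])   # logger temperatures, taken as "truth"

# Least-squares fit of: in_situ = a * satellite + b
a, b = np.polyfit(sat, insitu, 1)

def calibrate(sat_temp):
    """Map a satellite temperature onto the in-situ scale."""
    return a * sat_temp + b
```

The fitted line passes through the mean of both records, so the correction is unbiased at the calibration sites; its validity elsewhere depends on how transferable the bias really is, which cross-validation (leaving one logger site out at a time) can test.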
Thank you.
Cheers, Luís Pereira.
Hello, I hope this question finds its readers well. I would like to know how I can find the line-pressure specification of a pressure sensor (the specific brand I want to use aside, another example would also be okay). I am having some difficulty finding the value of the line pressure. Thank you for your help.
How can we distinguish shrimp from other objects in a pond with sensors?
Like the Kinect sensor, which is used for humans, I need something similar for shrimp in turbid water.
What is the role of sensors in precision agriculture and components of IoT based agriculture monitoring system?
With this question, I am asking for volunteer assistance with an open-source project, LoRaBinaryFloodMessaging within https://github.com/SoothingMist/Scalable-Point-to-Point-LoRa-Sensor-Network/tree/main. Allow me to explain.
Many applications, including precision irrigation, require remote sensing of data generated by various pieces of equipment. The project at hand uses LoRa point-to-point flood messaging to accomplish data transfer. The complexities and costs of LoRaWAN and third-party services are avoided.
The present project phase accommodates cameras and single-value sensors. The issue is that the GUI accommodates only one camera and one sensor. What is needed is an improvement to the GUI so that cameras and sensors are selectable. I was thinking that drop-down lists would work well; however, front-end work is not my specialty, and what I have found so far on this topic confuses me.
The GUI is driven by a basestation written in Python. Matplotlib is used to create and continuously update the GUI's two plots. The camera plot is not a picture really, just the display of a numpy matrix that is updated as image segments arrive. How would I add drop-down lists to the GUI? An example of the present state of the GUI is shown in the attached jpg. More detail is in the project’s documentation. Glad for any comments, suggestions, or direct help. Many thanks for your input.
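One hedged suggestion: Matplotlib has no built-in drop-down widget, but matplotlib.widgets.RadioButtons is the closest native selector and keeps everything inside the existing figure; embedding a Tkinter ttk.Combobox next to a FigureCanvasTkAgg is the usual route if a true drop-down is required. A minimal sketch with hypothetical device names (the real basestation would supply its own registry and redraw logic):

```python
import matplotlib
matplotlib.use("Agg")  # headless-safe backend for this sketch; use your normal backend in the GUI
import matplotlib.pyplot as plt
from matplotlib.widgets import RadioButtons

# One wide axes for the image/sensor plot, one narrow axes hosting the selector.
fig, (ax_plot, ax_sel) = plt.subplots(1, 2, gridspec_kw={"width_ratios": [4, 1]})

# Hypothetical device names; replace with the list the basestation maintains.
cameras = ["camera-1", "camera-2", "camera-3"]
selector = RadioButtons(ax_sel, cameras)

def on_select(label):
    # In the real GUI this would swap which device's numpy matrix is displayed.
    ax_plot.set_title(label)
    fig.canvas.draw_idle()

selector.on_clicked(on_select)
```

Because RadioButtons lives in its own axes, it survives the continuous plot updates from the image-segment stream; one widget per device class (cameras, sensors) keeps the layout simple.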
Is precision agriculture also known as satellite farming and what are IoT sensors used in precision agriculture?
Dear Scholars,
Assume a mobile air-pollution monitoring strategy using a network of sensors that move around the city, specifically a network of sensors that quantify PM2.5 at a height of 1.5 meters, with each run lasting about 20 minutes. Clearly, using this strategy we would lose temporal resolution to gain spatial resolution.
If we would like to perform spatial interpolation to "fill" the empty spaces, what would you recommend? What do you think about it? What would be your approximations?
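As a minimal baseline (a sketch only; given the space-time mixing of 20-minute mobile runs, kriging or land-use regression would be stronger choices), inverse-distance weighting over the segment-mean PM2.5 values can be written as:

```python
import numpy as np

def idw(x_obs, y_obs, z_obs, x_grid, y_grid, power=2.0):
    """Inverse-distance-weighted interpolation of PM2.5 point averages.

    x_obs, y_obs, z_obs : coordinates and values of the mobile-segment means
    x_grid, y_grid      : 1-D arrays of target locations to fill
    power               : distance exponent (2 is the conventional choice)
    """
    z = np.empty(len(x_grid), dtype=float)
    for k, (xg, yg) in enumerate(zip(x_grid, y_grid)):
        d = np.hypot(x_obs - xg, y_obs - yg)
        if np.any(d < 1e-12):            # target sits on an observation
            z[k] = z_obs[np.argmin(d)]
            continue
        w = 1.0 / d ** power
        z[k] = np.sum(w * z_obs) / np.sum(w)
    return z
```

IDW is exact at the observation points and bounded by the observed range, but it ignores the temporal offset between runs; tagging each segment mean with its time window and interpolating within time strata is one way to respect the lost temporal resolution.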
Regards
What sensors do drones use in agriculture and how are drones used in crop monitoring?
Which are the sensors can be used in agriculture in IoT and use of sensors in the field of automation and control?
What are the different types of sensors used in agriculture and application of sensors in the field of agriculture?
How are advanced fire safety technologies, such as early warning systems, intelligent suppression systems, and real-time data analytics, revolutionizing fire prevention, detection, and response in high-rise buildings and critical infrastructure, and what are the key challenges in integrating these technologies to ensure comprehensive and adequate fire safety measures?
Is there anyone who can help me in finding the appropriate template for this Elsevier journal?
Automotive manufacturers are increasingly utilizing artificial intelligence (AI) and machine learning techniques to enhance vehicle autonomy, safety, and overall driving experience in modern advanced driver assistance systems (ADAS) and autonomous vehicles. These technologies are revolutionizing the automotive industry by enabling vehicles to perceive their surroundings, make informed decisions, and interact with the environment more effectively. Here's how AI and machine learning are being utilized:
- Sensor Fusion and Perception: AI algorithms integrate data from various sensors, such as cameras, LiDAR (Light Detection and Ranging), radar, and ultrasonic sensors, to create a comprehensive and accurate perception of the vehicle's surroundings. Machine learning enables the system to learn and adapt to different driving scenarios, improving the accuracy of object detection, lane detection, and obstacle recognition.
- Autonomous Navigation and Path Planning: AI-based path planning algorithms use real-time sensor data and digital maps to plan safe and efficient routes for autonomous vehicles. Machine learning enables the system to consider dynamic factors like traffic conditions, road closures, and pedestrian behavior, ensuring smooth and safe navigation.
- Predictive Maintenance: AI and machine learning are used to analyze vehicle data to predict component failures and perform proactive maintenance, reducing downtime and enhancing vehicle reliability.
- Driver Monitoring and Behavior Analysis: AI-powered cameras and sensors inside the vehicle can monitor driver behavior, attention, and alertness. Machine learning algorithms can detect signs of drowsiness, distraction, or impairment, providing alerts or interventions to improve safety.
- Adaptive Cruise Control (ACC): AI is utilized in ACC systems to maintain a safe distance from the vehicle ahead. Machine learning models continuously learn and adapt to the driver's preferences and driving style.
- Lane Keeping and Lane Departure Warning: AI-based lane detection algorithms enable vehicles to stay within the lane, and machine learning helps in distinguishing intentional lane changes from unintended lane departures, triggering appropriate warnings if necessary.
- Advanced Collision Avoidance Systems: AI and machine learning techniques power advanced collision avoidance systems, which can autonomously apply brakes or take evasive maneuvers to prevent or mitigate collisions.
- Natural Language Processing (NLP) and Voice Commands: AI-powered NLP enables voice-based interaction with infotainment systems, navigation, and other in-car functionalities, improving the overall driving experience and reducing driver distractions.
- Data Security and Cybersecurity: AI is utilized to detect anomalies in vehicle data and identify potential cybersecurity threats, protecting connected vehicles from cyber-attacks.
- Continuous Improvement and Over-the-Air Updates: AI-driven analytics enable automotive manufacturers to gather data from the vehicle fleet, monitor performance, and push over-the-air updates to improve algorithms, enhance features, and address safety concerns.
As AI and machine learning continue to evolve, automotive manufacturers will leverage these technologies to make autonomous driving safer, more reliable, and accessible to a broader range of vehicles, leading to transformative advancements in the automotive industry.
We have designed a 2D photonic crystal sensor using OptiFDTD software, and we want to find the different resonance wavelengths obtained by changing the analytes or samples at the center of the 2D photonic crystal structure, using an observation point at the output end. When we change the samples, the resonance wavelength does not change. What should we do?
Can anyone explain please.
Yours Sincerely
Jay
The literature suggests that, unlike non-ferromagnetic materials, which lead to a decrease in the impedance of an eddy current sensor, ferromagnetic materials lead to an increase in the impedance of the coil. It is stated that this behavior is reversed at sensor frequencies > 1 MHz. Is this true?
1. In the practice of structural dynamics and control, non-minimum phase systems are common, and for the needs of motion and vibration control we sometimes have to find a minimum phase system close to the original. We can do this using structure modification, parameter adjustment, sensor re-placement, parallel compensators, series filters, output reconstruction, etc.
2. There seems to be a "critical" system in practice. For example, when adjusting the sensor position, as the sensor approaches the actuator there is a critical position, and once it moves closer than that we obtain a minimum phase system.
3. Can we construct the concept of "the closest minimum phase system of a non-minimum phase system", just like the concept of "the closest linear system of a nonlinear system"?
4. It is conjectured that such a "closest minimum phase system" should (1) have a closed-loop steady-state response and low-frequency response consistent with, or close to, those of the original non-minimum phase system; (2) have an inverse that is a potential feed-forward controller for the original system; (3) require only minimal parameter adjustment; and (4) require only minimal control effort.
5. If we could construct one somehow, it should be useful in theory and practice, at least in the field of high-speed, high-precision motion control in which I am interested.
I am working on a project with sensors that can measure agricultural levels of nitrogen/potassium/phosphorus, as well as moisture/temperature/pH. The sensor measures in mg/kg, but the farmers we are trying to help need kg/hectare. We were given the following conversion formula, but the outcome is not right; I think I know why, but not how to solve it.
Nitrogen in kg/ha = (N in mg/kg × depth of measurement in cm × soil bulk density in g/cm³) / 10
I think the outcome is not quite right because of the depth of measurement. Normally, with hand samples sent to a lab, the depth of measurement is more accurate and matters more. But I do not have a clue how to correct the formula; does anyone have experience in this field and know how to fix this?
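The formula itself is dimensionally consistent, as the sketch below shows; the uncertainty is entirely in which depth to use. For an in-situ probe, the effective sensing depth (something to calibrate against lab samples) should replace the nominal sampling depth:

```python
def mg_per_kg_to_kg_per_ha(conc_mg_per_kg, depth_cm, bulk_density_g_cm3):
    """Convert a soil nutrient reading to kg/ha over the sampled layer.

    kg/ha = mg/kg x depth(cm) x bulk density(g/cm3) / 10

    Derivation: 1 ha = 1e8 cm2, so the sampled layer holds
    1e8 * depth * density grams = 1e5 * depth * density kg of soil;
    each mg/kg of that mass is (depth * density / 10) kg of nutrient.
    `depth_cm` must be the depth the reading actually represents -- for a
    probe, its effective sensing depth, not the lab-sampling depth.
    """
    return conc_mg_per_kg * depth_cm * bulk_density_g_cm3 / 10.0
```

So the formula is fine; the fix is to determine the probe's effective depth empirically (e.g. by comparing probe readings against lab results from known sampling depths) and use that value.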
Thank you for reading.
Morris la Crois
I require assistance regarding this issue:
How does a change in length of an optical tapered fiber impact the interference between different modes? If an optical fiber sensor relies on mode interference, how does the sensor's performance change with variations in the tapered fiber's length? Are there any relevant formulas to address this concern?
keywords: taper, optical fiber, propagating modes, optical fiber sensors
Hello guys,
As far as I know, a Shack-Hartmann wavefront sensor can measure some optical systems; can it measure a double-telecentric lens?
Currently I am working on an SPR-PCF biosensor for biochemical detection. I am stuck in COMSOL Multiphysics trying to find the sensitivity of the SPR-PCF sensor: how do I do the analysis, where do I enter the formula, and how do I check the results in COMSOL? Could someone please suggest a solution and give a detailed explanation?
I receive the signal using an unmodulated continuous-wave radar sensor and I'm struggling to obtain the relative distance of the object. I receive the IQ data in millivolts and use the formula delta_phase = arctan2(Q_samples, I_samples), but the change in phase as I plot it is not that big, although I keep moving my hand back and forth in front of the sensor, where I should see larger changes. The sensor is 61 GHz and the wavelength is 4.92 mm. Any suggestion whether I'm using the data in a wrong way, or should I apply filtering before obtaining the phase change?
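Two likely issues: arctan2 takes Q and I as two separate arguments (a single ratio Q/I loses the quadrant), and the phase must be unwrapped before converting to displacement. At λ = 4.92 mm, a hand moving several centimetres produces many 2π wraps, so without unwrapping the plotted phase stays pinned inside ±π, which may be exactly why the change looks small. A sketch under those assumptions (function name mine):

```python
import numpy as np

def displacement_mm(i_mv, q_mv, wavelength_mm=4.92):
    # arctan2 takes (Q, I) as TWO arguments; arctan2 of the ratio
    # Q/I would lose the quadrant information.
    phase = np.unwrap(np.arctan2(q_mv, i_mv))
    # Round-trip geometry: delta_d = wavelength * delta_phi / (4*pi).
    return wavelength_mm * (phase - phase[0]) / (4 * np.pi)
```

A mild low-pass filter on I and Q before the arctan2 can also help, but the unwrap is the essential step.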
Does anyone know of a fluorescent protein with an emission max greater than 720 nm? Which is the best fluorescent protein with an emission max greater than 700 nm for making a BRET sensor? I have many luciferase mutants which could serve as donors in the 610-630 range provided I find a good acceptor FP in the far red range.
Disposable medical sensors are easy-to-use and economical. Medical sensors, under which disposable medical sensors fall, are devices that aid in the detection of physical, biological, and chemical signals.
These devices offer a way for these signals to be recorded and measured.
When we view the sensor output from the photodiode on an oscilloscope, it is expressed in volts per pascal (of sound pressure applied to the sensor), but to remove noise etc. we need to convert this characteristic of the sensor into a phase sensitivity.
How do we convert mV/Pa to rad/Pa in the sensitivity of optical fiber sensors?
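For a two-beam interferometric sensor biased at quadrature, the output slope is approximately half the peak-to-peak fringe voltage per radian, so S_φ [rad/Pa] ≈ S_V [mV/Pa] / (V_pp/2 [mV/rad]). A minimal sketch under that quadrature, small-signal assumption (function name mine):

```python
def phase_sensitivity_rad_per_pa(s_v_mv_per_pa, fringe_vpp_mv):
    # At quadrature, a two-beam interferometer's voltage-vs-phase
    # slope is ~ Vpp/2 (mV per rad); dividing the measured voltage
    # sensitivity by that slope yields the phase sensitivity.
    return s_v_mv_per_pa / (fringe_vpp_mv / 2.0)

# e.g. 10 mV/Pa with a 200 mV peak-to-peak fringe -> 0.1 rad/Pa
print(phase_sensitivity_rad_per_pa(10.0, 200.0))  # 0.1
```

So in practice you need one extra measurement, the full fringe peak-to-peak voltage (e.g. obtained by stretching the fiber through more than one fringe), to make the conversion.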
I am searching for new sensors used in cars for effective motor performance.
Can you recommend a book, a paper, or a website?
I have tested a structure using accelerometer sensors; the data is attached herewith.
Can you please check the data and help me recover the actual structural response from the recorded signal?
The manufacturer states the noise spectral density as 45 µg/√Hz.
Can you suggest a filter to denoise the data?
The sampling rate is 100 Hz.
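Since the stated noise (45 µg/√Hz) is broadband and the structural modes of interest presumably sit well below the 50 Hz Nyquist limit, a zero-phase low-pass Butterworth filter is a common first attempt. A sketch, assuming a 20 Hz cutoff, which you should adjust to sit above your highest mode of interest:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def denoise(signal, fs=100.0, cutoff=20.0, order=4):
    # Zero-phase Butterworth low-pass; filtfilt runs the filter
    # forward and backward, so it adds no phase distortion, which
    # matters for modal identification.
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, signal)
```

If the modes are confined to a known band, a band-pass variant (`btype="band"`) would reject low-frequency drift as well.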
I need to cure a large number of PDMS-based samples for the fabrication of piezoresistive sensors, but I do not have a hot-air oven for this purpose. What are the other options to cure PDMS, other than keeping it at room temperature for 48 hours?
As we know, the smaller the cell size of an imaging sensor (CCD array), the lower the amount of light that reaches each cell. However, state-of-the-art sensors' cell sizes have already reached the smallest level. Does this fact lead to the poor contrast quality of images acquired with these imaging sensors?
I am currently working on a project called double rotary inverted pendulum that requires the use of sensors for accurate measurement with minimal delay and noise interference. I have come across two types of sensors, namely the analogue Hall Effect Potentiometer Angle Encoder Sensor and the digital Rotary Incremental Encoder.
In the context of accuracy, delay, and noise levels, I would greatly appreciate expert insight on which of these two sensors is more suitable for my project. Can someone provide guidance on the pros and cons of each sensor type in terms of accuracy, delay, and noise reduction?
Does anyone have experience in designing an RFID tag as a temperature sensor?
So we are given this problem:
A pressure sensor is specified to operate in the range of 0-100psia and provide a 4-20mA current loop output over this range. The pressure sensor current output is connected in the following configuration:
The load resistor used was measured as RL = 250Ω (±0.01%), and the resultant voltage VL was measured with an ADC with an error of ±0.5% FSO (10VDC).
Given that I have an equation:
P = 6250(V/R) - 25
That is where my question arises...
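For reference, the given equation inverts V = I·R with the 4-20 mA span mapped to 0-100 psia, and the stated tolerances can be propagated to first order. A sketch (treating the ±0.5% FSO ADC error and the ±0.01% resistor tolerance as independent; how the question continues is not shown above, so the uncertainty treatment is my assumption about where it is heading):

```python
import math

def pressure_psia(v, r=250.0):
    # 4 mA -> 0 psia, 20 mA -> 100 psia, measured across R:
    # P = ((V/R) - 0.004) / 0.016 * 100 = 6250*(V/R) - 25
    return 6250.0 * v / r - 25.0

def pressure_uncertainty(v, r=250.0, dv=0.05, dr=0.025):
    # First-order propagation: dv = 0.5% of the 10 V ADC FSO,
    # dr = 0.01% of 250 ohms.
    return 6250.0 * math.hypot(dv / r, v * dr / r**2)

print(pressure_psia(5.0))                    # 100.0 psia (20 mA, full scale)
print(round(pressure_uncertainty(5.0), 3))   # ~1.25 psia, ADC-dominated
```

Note how the resistor's 0.01% tolerance is negligible next to the ADC's ±0.05 V term; the error budget is essentially the ADC's.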
Is there a specific value (or benchmark) of sensitivity for the sensor to be considered as good in detecting gas?
When simulating a wireless sensor network in NetSim, what are the parameters to vary to increase/decrease the communication range of the sensors? How is the default range calculated? How can I modify the GUI grid size (environment) in proportion to the communication range?
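In simulators of this kind, the communication range typically falls out of a link budget: the distance at which received power (transmit power minus path loss) drops to the receiver sensitivity, so the parameters to vary are transmit power, receiver sensitivity, frequency, and the path-loss model/exponent. A sketch using a log-distance path-loss model (the defaults below are illustrative, not NetSim's internal values):

```python
import math

def max_range_m(tx_power_dbm, rx_sensitivity_dbm,
                freq_hz=2.4e9, path_loss_exp=2.0):
    # Log-distance path-loss model with a free-space reference at
    # d0 = 1 m: range is where received power equals the receiver
    # sensitivity.
    c = 3.0e8
    fspl_1m_db = 20.0 * math.log10(4.0 * math.pi * freq_hz / c)
    margin_db = tx_power_dbm - rx_sensitivity_dbm - fspl_1m_db
    return 10.0 ** (margin_db / (10.0 * path_loss_exp))

# e.g. 0 dBm transmitter, -85 dBm sensitivity, free space at 2.4 GHz
print(round(max_range_m(0.0, -85.0), 1))
```

With a range figure like this in hand, scaling the GUI grid side to a small multiple of the range keeps the topology meaningful.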
How can we use sensors and other IoT devices to collect real-time data on soil moisture, temperature, and other environmental factors that affect crop growth?
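At the node level this usually reduces to a polling loop that timestamps readings and publishes structured records. A minimal sketch with simulated sensor values; the record fields and the driver stub are placeholders for a real ADC/I2C sensor driver and a transport such as MQTT or LoRaWAN:

```python
import json
import random
import time

def read_soil_sensor():
    # Placeholder for a real driver (e.g. an ADC/I2C read on the
    # node); the values here are simulated.
    return {"moisture_pct": round(random.uniform(10, 40), 1),
            "temp_c": round(random.uniform(15, 30), 1),
            "ph": round(random.uniform(5.5, 7.5), 2)}

def sample(n_readings, interval_s=0.0):
    # Poll, timestamp, and serialize records in the shape a field
    # node would publish to a farm dashboard or time-series store.
    records = []
    for _ in range(n_readings):
        records.append(json.dumps({"ts": time.time(), **read_soil_sensor()}))
        time.sleep(interval_s)
    return records
```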
Early leak detection in an offshore hydrocarbon pipeline is critical for ensuring the safety of personnel and the environment. However, it is challenging to achieve effective leak detection without false alarms, as a variety of factors can cause false alarms, such as changes in temperature, pressure, or flow rates, as well as instrument malfunction or even marine life interference.
Several technologies are available for early leak detection in offshore pipelines, including acoustic, thermal, and optical sensors, among others. Each technology has its strengths and limitations, and the most effective solution may depend on various factors, such as the pipeline location, operating conditions, and type of hydrocarbon being transported.
One approach to minimizing false alarms is to use multiple sensors and incorporate them into a comprehensive leak detection system that can analyze data from different sensors and cross-check the results to reduce false alarms. Additionally, regularly testing and maintaining the sensors and system can also help to minimize false alarms and ensure that the system is functioning effectively.
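The cross-checking idea above can be as simple as majority voting across independent detection channels before raising an alarm. A toy sketch (channel names and the vote threshold are illustrative):

```python
def leak_alarm(sensor_flags, min_votes=2):
    # sensor_flags: per-channel detections, e.g.
    #   {"acoustic": True, "thermal": False, "optical": True}
    # Requiring agreement between independent channels suppresses
    # single-sensor false positives (temperature swings, instrument
    # glitches, marine life, etc.) at the cost of some sensitivity.
    return sum(bool(v) for v in sensor_flags.values()) >= min_votes
```

Real systems weight channels by reliability and add temporal persistence checks, but the trade-off is the same: raising min_votes cuts false alarms while delaying detection of genuine leaks seen by only one sensor type.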
Ultimately, achieving effective early leak detection without false alarms in offshore hydrocarbon pipelines requires a combination of appropriate technology selection, system design, and regular maintenance and testing to ensure optimal performance.
Please elaborate on your opinion on it.
Generally, experimentalists apply thermal effects (temperature) to determine the recovery time of a sensor, which can theoretically be estimated through transition-state theory by the following equation:
τ = ν⁻¹ · exp(E_ads / (k_B · T))
Here ν, E_ads, k_B and T are the attempt frequency, adsorption energy, Boltzmann constant and temperature, respectively.
How can the attempt frequency of any material used as a sensor be calculated theoretically? In articles, only the reported values 10^-10 or 10^-12 are mentioned. Is this value constant?
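Numerically, the transition-state expression is straightforward once ν is fixed. A sketch, using a phonon-scale attempt frequency of ~10¹³ Hz as an assumed default (ν is not a universal constant; it can be estimated from the vibrational frequency of the adsorbate in its adsorption well, e.g. from a phonon or frequency calculation):

```python
import math

K_B_EV_PER_K = 8.617333262e-5  # Boltzmann constant, eV/K

def recovery_time_s(e_ads_ev, temperature_k, attempt_freq_hz=1.0e13):
    # Transition-state estimate: tau = nu^-1 * exp(E_ads / (k_B * T)).
    # 1e13 Hz is a typical phonon-scale assumption for nu, not a
    # material-specific value.
    return math.exp(e_ads_ev / (K_B_EV_PER_K * temperature_k)) / attempt_freq_hz
```

This also shows why experimentalists heat the sensor: raising T shrinks the exponent and the recovery time falls steeply.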
Hello,
I am working with high-frequency (144 measurements/day) temperature logger data. It was two individual temp loggers measuring at equal intervals (10 mins) at the same stationary location for 120 days.
One sensor was calibrated, while the other was not. The uncalibrated sensor shows high drift after about a month of use. I am looking to determine the statistical significance of the difference between the logger types (calibrated vs. uncalibrated).
I was wondering if there was a way to compare the mean daily values of loggers (calibrated vs. uncalibrated) and then determine at approximately what day did the daily means become statistical different (i.e. day 24 of 120).
To start, I was thinking of using a paired t-test. However, the data (>10,000 points for each dataset) are not normally distributed. Still, I am thinking that with the large sample sizes a paired t-test may be sufficient.
Any and all advice is greatly appreciated!
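One concrete way to do the day-by-day comparison without a normality assumption is a paired Wilcoxon signed-rank test on each day's 144 aligned readings, reporting the first day the difference becomes significant. A sketch (the function name is mine; with ~120 daily tests you may also want a multiple-comparison correction such as Bonferroni, i.e. a stricter alpha):

```python
import numpy as np
from scipy.stats import wilcoxon

def first_divergent_day(calibrated, uncalibrated, per_day=144, alpha=0.05):
    # Both inputs: aligned 10-minute readings from the same location.
    # Compare the loggers day by day with a paired Wilcoxon test and
    # return the first (1-indexed) day the difference is significant.
    n_days = min(len(calibrated), len(uncalibrated)) // per_day
    for day in range(n_days):
        s = slice(day * per_day, (day + 1) * per_day)
        diff = np.asarray(calibrated[s]) - np.asarray(uncalibrated[s])
        if not np.any(diff):  # identical readings: nothing to test
            continue
        if wilcoxon(diff).pvalue < alpha:
            return day + 1
    return None
```

Because consecutive 10-minute readings are autocorrelated, the per-day p-values are optimistic; treating the result as an estimate of the drift onset day, rather than a strict hypothesis test, is the safer interpretation.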
I am doing thesis research on ring resonators.
As technology advances, cables and classic electrodes will disappear; AI and sensors will take over instead. New monitoring devices will have to be developed: less invasive, smaller, more comfortable, and performing just as well.
Optical tweezers (OT) are a well-established tool in many research fields, especially biology and physics: they can be used to manipulate tiny objects, measure weak forces, and sense a local area. But they do not seem to be used in industry yet. So my question is: what is the future of OT?
Which temperature sensor is widely used, what do you understand by vertical and horizontal variation in temperature, and how are vertical temperature profiles obtained?
An IoT-based system uses an ESP32, an ESP8266, a PZEM-004T, and other sensors, software, and smart devices to collect data on energy consumption, production, and distribution in a smart microgrid, and processes this data to provide insight into energy use and optimize energy management.
Dear Researcher,
For measuring the response time of SPR-based sensors, is there any specific formula or theoretical explanation? I need some references.
By whole-body I mean a single dynamic measurement that contains the health measures for a bridge. Not, for example where multiple vibration sensors are placed at locations and their data reintegrated at a later date.
How can precision agriculture technologies, such as drones and sensors, be used to optimize crop yield and reduce input costs?
Hello, I am from Argentina and I am an electronic engineer and PhD student working with resonators. I have made several PDMS chips for microfluidics but I don't know how to stick/glue them to the sensor in a non-permanent way. Also, fluid should not spill on the sensor (only on the gold sensing area). For that reason the PDMS chip is made to guide the liquid.
I hope for an answer!
I am working on a small wind turbine as part of my internship for my course. The rated capacity of the small wind turbine is 700 watts at a rated wind speed of (…) m/s. The turbine is installed on a (height of the pole) m- steel pole/ tower. Further, the turbine is connected to the local electricity grid.
Challenges faced: at a wind speed of 3 m/s, I am observing that the connected sensors/electronics are consuming 15 W.
My research is the development of electrochemical sensors such as glucose, lactate, and potentiometric ion sensors based on Au electrodes.
Au electrodes in electrochemical applications have nanostructures to provide a large electrochemically active area. Unfortunately, electrodeposition of the sensing materials is not uniform on the surface, resulting in low reproducibility and sensing performance. In an attempt to address this issue, I employed the CV method in 0.25 M HCl to clean the Au surface. However, this approach did not yield the desired results.
Therefore, I am seeking your expert advice on how to obtain reliable Au electrodes that can offer consistent and accurate results. I would greatly appreciate any suggestions or recommendations you may have to improve the reproducibility of these sensors.
I have fabricated an impedimetric aptasensor for small-molecule detection. The black line in the attached Nyquist plot is the EIS measurement before incubation with the target. After incubating with the target for 1 hour, the EIS measurement gave a lower Rct (purple line). The target-incubated sensor was then washed with deionized water and dried in a nitrogen stream. The EIS measurement was performed again and a higher Rct was observed (red line). (All measurements were performed in 5 mM ferri/ferrocyanide and 0.1 M KCl; the aptamer is linked via a C6 spacer and the sensor is unblocked.) Can anyone explain this phenomenon?
I once thought about the possibility of predicting earthquakes by implanting a special sensor chip in mice. But then I thought further: if this sensor chip is always in the mouse's body, it will affect the mouse's normal abilities, so good results cannot be obtained from it. What do you think about it?
AI to recognize cardiac arrhythmias, for example by using a thermal scanner and other physiological sensors: a new approach and a new product without the classic electrodes.
Dear friends,
I am going to design an optical fiber Bragg grating sensor for pressure and strain measurement using COMSOL software. But I am new to this software, so if anybody has any idea about this, please help me out.
Thanks & regards
Nilakanta
Are there sensors for urban air temperature (heat-island-effect studies) with a better quality/price ratio than the iButton? I don't need higher quality than the iButton, just temperature sensors at lower cost. Thanks!
I have data from a specimen vibrating in one direction with an input of 1 G sine sweep. There are accelerometers in different locations. The FFT of the data from all these sensors shows two close peaks, because two modes are close to each other in frequency. Other than that, I am wondering whether we can get more information from these peaks.
I am working on a project on a force-feedback system in a 3-finger barot gripper. For that I need a CMC sensor to integrate with the gripper system.
Hello!
I have collected muscle activity data with the Muscle Sensor v3 Kit. Now I would like to apply a machine learning algorithm to it. According to the datasheet for this sensor, it has already been amplified, rectified, and smoothed.
Would anyone be able to tell me if the data needs to be denoised before applying machine learning? Here's how the data looks after plotting:
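Since the kit already amplifies, rectifies, and smooths the signal, heavy denoising may be unnecessary; a more typical next step is windowed feature extraction for the learning algorithm. A sketch of two standard EMG features, mean absolute value (MAV) and RMS; the sampling rate and window length below are assumptions to adjust for your setup:

```python
import numpy as np

def window_features(signal, fs=1000.0, win_s=0.2):
    # Split the signal into non-overlapping windows and compute
    # (MAV, RMS) per window: two standard EMG features for
    # gesture/activity classification.
    n = int(fs * win_s)
    frames = np.asarray(signal)[: len(signal) // n * n].reshape(-1, n)
    mav = np.abs(frames).mean(axis=1)
    rms = np.sqrt((frames ** 2).mean(axis=1))
    return np.column_stack([mav, rms])
```

If the plotted traces show mains interference or baseline wander despite the on-board smoothing, a band-pass filter before feature extraction would still be worthwhile.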
I have a drive shaft connected to a torque sensor, but to avoid damage I must constrain the linear motion to pure rotational motion of the shaft. Although the displacement of the shaft is only a few millimeters, the force will be quite large. How do I convert the combined linear and rotational motion of this shaft to pure rotational motion without sacrificing the torque output?
How can the problem be solved if a sensor is not capable of transmitting data to the base station in a WSN? Are there any papers discussing this issue?