Memtransistors, in which the source-drain channel conductance can be nonvolatilely manipulated through gate signals, have emerged as promising components for implementing neuromorphic computing. On the other hand, complementary metal-oxide-semiconductor (CMOS) field-effect transistors have played a fundamental role in modern integrated-circuit technology. Will complementary memtransistors (CMTs) play a similar role in future neuromorphic circuits and chips? In this review, the various materials and physical mechanisms for constructing CMTs (how) are inspected, and their merits and outstanding challenges are discussed. The unique properties (what) and potential applications of CMTs in different learning algorithms and scenarios of spiking neural networks (why) are then reviewed, including supervised and reinforcement learning rules, dynamic vision with in-sensor computing, and more. By exploiting the novel functions related to the complementary structure, significant reductions in hardware consumption, improvements in the energy-efficiency ratio, and other advantages have been gained, illustrating the alluring prospect of design-technology co-optimization (DTCO) of CMTs for neuromorphic computing.
The high demand for lung transplants cannot be matched by an adequate number of donor lungs. Since fully ex-novo lungs are far from feasible, tissue engineering is actively considering the implantation of engineered lungs, in which the devitalized structure of a donor lung is used as a scaffold to be repopulated by stem cells of the receiving patient. A decellularized donated lung is treated inside a bioreactor, where transport through the tracheobronchial tree (TBT) allows both the deposition of stem cells and nourishment for their subsequent growth, thus developing new lung tissue. The key concern is to set optimal boundary conditions for the bioreactor. We propose a predictive model of slow liquid ventilation, which combines a one-dimensional (1-D) mathematical model of the TBT with a solute deposition model strongly dependent on fluid velocity across the tree. With it, we were able to track and drive the concentration of a generic solute across the airways, seeking its optimal distribution, obtained by properly adjusting the regime of the pumps serving the bioreactor. A feedback system, created by coupling the two models, allowed us to derive the optimal pattern. The TBT model is easily invertible, yielding a straightforward flow/pressure law at the inlet to optimize the efficiency of the bioreactor.
Sorafenib is an effective anti-angiogenic treatment for hepatocellular carcinoma (HCC). The assessment of tumor progression in patients treated with sorafenib is crucial to help identify potentially resistant patients, avoiding unnecessary toxicities. Traditional methods to assess tumor progression are based on variations in tumor size and provide unreliable results in patients treated with sorafenib. Newer methods, such as the modified Response Evaluation Criteria in Solid Tumors or the European Association for the Study of the Liver criteria, are based on imaging to measure vascularization and tumor volume (viable or necrotic). These, however, fail especially when the tumor response results in irregular development of necrotic tissue. The latest assessment techniques focus on the evaluation of tumor volume, density, or perfusion. Perfusion computed tomography and Dynamic Contrast-Enhanced Ultrasound can measure the vascularization of HCC lesions and help predict tumor response to anti-angiogenic therapies. Mean Transit Time is a possible predictive biomarker of tumor response. Volumetric techniques are reliable, reproducible, and time-efficient, and can measure minimal changes in viable tumor or necrotic tissue, allowing the prompt identification of non-responders. Volume ratio may be a reproducible biomarker of tumor response. Larger trials are needed to confirm the use of these techniques in the prediction of response to sorafenib.
This research focused on the valorisation of glycerol, exploring the feasibility of an efficient route for the production of oxygenated additives based on its etherification with bio-butanol. A home-made BEA zeolite sample with tuneable acidity was proposed as the catalytic system and tested in a stirred reactor under different etherification conditions. Although a reaction temperature as high as 200 °C proved beneficial in terms of glycerol conversion (~90%), only by operating under milder conditions can the product selectivity to glycerol ethers be controlled well enough to obtain a bio-fuel complying with the requirements for blending with fossil diesel or biodiesel, without any need for purification from large amounts of by-products. A comprehensive identification of all the compounds formed during the reaction was performed by GC-MS analysis, on the basis of the complex network of consecutive and parallel reaction paths leading not only to the desired ethers, but also to many side products not detected in similar acid-catalyzed liquid-phase reactions and not available in the most widely used mass-spectra libraries.
The current IT society is populating the planet with a plethora of applications at an unprecedented rate, pushed by advances in fabrication, mechatronics, and communication technologies on the one hand and by the availability of sophisticated sensor and actuator devices on the other [1]. Applications, targeting everyday life as well as industrial and mission-critical needs, are either confined to a single target device or distributed within a network of units, also taking advantage of seamless communication capabilities to address different application scenarios. Examples of this trend are the Internet-of-Things, smart-whatsoever systems (home, grid, building, city, planet), and cyber-physical systems [2].
The present paper describes the conceptual idea of using cars as sensors to measure and acquire data related to the road environment. The parameters are collected using only standard equipment commonly installed and operative on commercial cars. Real sensors and car sub-systems (e.g. thermometers, accelerometers, ABS, ESP, and GPS) together with other “implicit” sensors (e.g. fog lights, windscreen wipers) acquire and contain information. These data are shared inside an in-vehicle communication network, mainly over the standard CAN bus, and can be collected by a simple central node. Such a node is already available on the market at reasonable cost, thanks to companies whose business is devoted to car-fleet monitoring. All the collected data are then geolocalized using a standard GPS receiver and sent to a remote elaboration unit, exploiting mobile network technologies such as GPRS or UMTS. A large number of cars, connected together in a diffuse wireless sensor network, allows the elaboration unit to build info-layers put at the disposal of car drivers. Traffic, road-state, and weather information can be received by drivers through a purpose-built smartphone application, which gives punctual information related to a specific route previously set on the mobile phone navigator. The description of some experimental activities is presented, some technical points are addressed, and some examples of applications of the network of cars “as sensors” are given.
Cardiovascular disease (CVD) risk assessment is an important instrument for enhancing clinical decisions in daily practice as well as for improving preventive health care, promoting the transfer from the hospital to the patient’s home. Given its importance, clinical guidelines recommend the use of risk scores to predict the risk of a cardiovascular disease event. Several well-known risk assessment tools therefore exist; unfortunately, they present some limitations. This work addresses this problem with two different methodologies: 1) combination of risk assessment tools based on a fusion of Bayesian classifiers complemented with genetic-algorithm optimization; 2) personalization of risk assessment through the creation of groups of patients that maximize the performance of each risk assessment tool, implemented using subtractive clustering applied to a reduced-dimension space. Both methodologies were developed for short-term CVD risk prediction in patients with Acute Coronary Syndrome without ST-segment elevation (ACS-NSTEMI). Two real patient datasets were used to validate the developed strategies: 1) Santa Cruz Hospital, Portugal, N = 460 patients; 2) Leiria-Pombal Hospital Centre, Portugal, N = 99 patients. This work improved performance with respect to current risk assessment tools, reaching maximum values of sensitivity, specificity, and geometric mean of, respectively, 80.0%, 82.9%, and 81.5%. Besides this enhancement, the proposed methodologies allow the incorporation of new risk factors, deal with missing risk factors, and avoid the selection of a single tool to be applied in daily clinical practice. In spite of these achievements, CVD risk assessment (patient stratification) should be further improved.
The incorporation of new risk factors recognized as clinically significant, namely parameters derived from heart rate variability (HRV), is introduced in this work. HRV is a strong and independent predictor of mortality in patients following acute myocardial infarction. The assessment of the impact of HRV parameters on the characterization of coronary artery disease (CAD) patients will be conducted during the hospitalization of these patients in the Leiria-Pombal Hospital Centre (LPHC).
We study the short-term memory capacity of ancient readers of the original New Testament written in Greek and of its translations into Latin and into modern languages. To model it, we consider the number of words between any two contiguous interpunctions, I<sub>p</sub>, because this parameter can model how the human mind memorizes “chunks” of information. Since I<sub>p</sub> can be calculated for any alphabetical text, we can perform experiments, otherwise impossible, with ancient readers by studying the literary works they used to read. The “experiments” compare the I<sub>p</sub> of texts in one language/translation to those of another by measuring the minimum average probability of finding joint readers (those who can read both texts because of similar short-term memory capacity) and by defining an “overlap index”. We also define the population of universal readers: people who can read any New Testament text in any language. Future work is vast, with many research tracks, because alphabetical literatures are very large and allow many experiments, such as comparing authors, translations, or even texts written by artificial intelligence tools.
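The word-interval statistic described in this abstract can be sketched in a few lines of Python. This is a hypothetical minimal implementation: the punctuation set and the whitespace tokenization are assumptions, since the paper's exact definitions are not reproduced here.

```python
import re

def word_intervals(text):
    """Split a text at interpunctions (punctuation marks) and count the
    words in each resulting chunk, i.e. the word interval Ip."""
    # Punctuation characters treated as interpunctions (an assumption;
    # the paper's exact list may differ).
    chunks = re.split(r"[.,;:!?]", text)
    return [len(c.split()) for c in chunks if c.strip()]

def mean_ip(text):
    """Average word interval of a text."""
    counts = word_intervals(text)
    return sum(counts) / len(counts)

sample = "In the beginning was the Word, and the Word was with God, and the Word was God."
print(word_intervals(sample))        # [6, 6, 5]
print(round(mean_ip(sample), 2))     # 5.67
```

Comparing the distributions of such counts across two translations of the same text is then a standard statistical exercise, which is the basis of the "experiments" the abstract refers to.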
In this paper, a data-based fault tolerant control (FTC) scheme is investigated for unknown continuous-time (CT) affine nonlinear systems with actuator faults. First, a neural network (NN) identifier based on particle swarm optimization (PSO) is constructed to model the unknown system dynamics. By utilizing the estimated system states, the particle-swarm-optimized critic neural network (PSOCNN) is employed to solve the Hamilton-Jacobi-Bellman equation (HJBE) more efficiently. Then, a data-based FTC scheme, which consists of the NN identifier and the fault compensator, is proposed to achieve actuator fault tolerance. The stability of the closed-loop system under actuator faults is guaranteed by the Lyapunov stability theorem. Finally, simulations are provided to demonstrate the effectiveness of the developed method.
Nanocarbon materials play a critical role in the development of new or improved technologies and devices for the sustainable production and use of renewable energy. This perspective paper outlines trends and outlooks in this exciting area, with the aim of evidencing some of the possibilities offered by the growing level of knowledge, as testified by the exponentially rising number of publications, and of laying the basis for a more rational design of these nanomaterials. The basic members of the new carbon family are fullerene, graphene, and the carbon nanotube. Derived from them are carbon quantum dots, nanohorns, nanofibers, nanoribbons, nanocapsules, nanocages, and other nanomorphologies. Second-generation nanocarbons are those which have been modified by surface functionalization or doping with heteroatoms to create specific tailored properties. Third-generation nanocarbons are the nanoarchitectured supramolecular hybrids or composites of the first- and second-generation nanocarbons, or of these with organic or inorganic species. The advantages of the new carbon materials for sustainable energy are discussed, evidencing the unique properties that they offer for developing next-generation solar devices and energy storage solutions.
Recent advances in the use of nanocarbon-based electrodes for the electrocatalytic conversion of gaseous streams of CO2 to liquid fuels are discussed in this perspective paper. A novel gas-phase electrocatalytic cell, different from the typical electrochemical systems working in the liquid phase, was developed. There are several advantages to working in the gas phase, e.g. no need to recover the products from a liquid phase and no problems of CO2 solubility. Operating under these conditions and using electrodes based on metal nanoparticles supported on carbon nanotube (CNT) materials, long C-chain products (in particular isopropanol under optimized conditions, but also hydrocarbons up to C8-C9) were obtained from the reduction of CO2. Pt-CNT electrodes are more stable and in some cases give a higher productivity, but Fe-CNT electrodes, particularly those using N-doped carbon nanotubes, show excellent properties and are preferable to noble-metal-based electrocatalysts owing to their lower cost. The control of the localization of metal particles at the inner or outer surface of the CNTs is an important factor for the product distribution. The nature of the nanocarbon substrate also plays a relevant role in enhancing the productivity and tuning the selectivity towards long C-chain products. The electrodes for the electrocatalytic conversion of CO2 are part of a photoelectrocatalytic (PEC) solar cell concept, aimed at developing knowledge for a new generation of artificial-leaf-type solar cells that can use sunlight and water to convert CO2 into fuels and chemicals.
The reduction of CO2 to liquid fuels by solar energy is a promising way to introduce renewables into the existing energy and chemical infrastructures, since such fuels have a higher energy density and are easier to transport and store than competing solutions (e.g. H2).
Statistics of languages are usually calculated by counting characters, words, sentences, and word rankings. Some of these random variables are also the main “ingredients” of classical readability formulae. Revisiting the readability formula for Italian, known as GULPEASE, shows that of the two terms that determine the readability index G, the semantic index G<sub>C</sub>, proportional to the number of characters per word, and the syntactic index G<sub>F</sub>, proportional to the reciprocal of the number of words per sentence, G<sub>F</sub> is dominant because G<sub>C</sub> is, in practice, constant for any author throughout seven centuries of Italian literature. Each author can modulate the length of sentences more freely than the length of words, and in different ways from author to author. For any author, any couple of text variables can be modelled by a linear relationship y = mx, but with a slope m that differs from author to author, except for the relationship between characters and words, which is unique for all. The most important relationship found in the paper is that between the short-term memory capacity, described by Miller’s “7 ± 2 law” (i.e., the number of “chunks” that an average person can hold in short-term memory ranges from 5 to 9), and the word interval, a new random variable defined as the average number of words between two successive punctuation marks. The word interval can be converted into a time interval through the average reading speed. The word interval spreads over the same range as Miller’s law, and the time interval spreads over the same range as short-term memory response times. The connection between the word interval (and time interval) and short-term memory appears, at least empirically, justified and natural, although it remains to be further investigated.
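As a concrete sketch of the index discussed above, the GULPEASE formula is commonly reported as G = 89 + (300 · sentences − 10 · letters) / words, which matches the description of the semantic and syntactic terms in this abstract. The decomposition into G<sub>C</sub> and G<sub>F</sub> below is an illustrative reading of that description, not the paper's exact notation.

```python
def gulpease(letters, words, sentences):
    """GULPEASE readability index for Italian text:
    G = 89 - Gc + Gf, where
      Gc = 10 * letters / words     (semantic term: characters per word)
      Gf = 300 * sentences / words  (syntactic term: 1 / words-per-sentence)
    Higher G means easier text (roughly on a 0-100 scale)."""
    gc = 10.0 * letters / words
    gf = 300.0 * sentences / words
    return 89.0 - gc + gf

# Example: a 100-word passage with 500 letters split into 5 sentences.
print(gulpease(letters=500, words=100, sentences=5))  # 54.0
```

Because G<sub>C</sub> (the 10 · letters/words term) is nearly constant across authors, variations in G are driven almost entirely by sentence length, which is the abstract's central observation.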
Technical and scientific writings (papers, essays, etc.) demand more of their readers because words are on average longer, the readability index G is lower, and word and time intervals are longer. Future work on ancient languages, such as classical Greek and Latin literature (or modern-language literatures), could give us insight into the short-term memory required of their well-educated ancient readers.
Photovoltaic (PV) systems have attracted increasing attention in recent years, as have Wireless Sensor Networks (WSNs), which have been used in many application fields. In PV plants, especially in ground installations, many thefts and damages occur because of the still-high cost of the modules. A new experimental ad-hoc WSN has been designed as an anti-theft alarm system. Each node of the network is installed directly under a PV string and is equipped with an accelerometer sensor capable of detecting a minimum displacement of the panel from its steady position. The WSN has a star topology: a master node cyclically interrogates the slave nodes over an RF link. It collects all the node responses and communicates through an RS-232 interface with a control PC that checks the network status. When a slave node detects an alarm, continuous messages are sent to the control PC, which turns on all the alarm signaling systems. The control PC runs an open-source operating system and software and provides SMS, e-mail, and sound-light signaling in case of alarm. It also communicates with a remote server where all the WSN information is stored. A first low-cost experimental WSN has already been installed and is working properly.
We propose the first statistical theory of language translation based on communication theory. The theory is based on New Testament translations from Greek into Latin and into 35 other modern languages. In a text translated into another language, all linguistic variables change numerically. To study the chaotic data that emerge, we model any translation as a complex communication channel affected by “noise”, studied according to communication theory, applied for the first time to this channel. This theory deals with aspects of languages more complex than those currently considered in machine translation. The input language is the “signal”; the output language is a “replica” of the input language, largely perturbed by noise that is, however, indispensable for conveying the meaning of the input language to its readers. We have defined a noise-to-signal power ratio and found that channels are differently affected by translation noise. Communication channels are also characterized by channel capacity. The translation of novels has more constraints than the New Testament translations.
We propose a global readability formula for alphabetical languages, not available for most of them, and conclude with a general theory of language translation which shows that direct and reverse channels are not symmetric. The general theory can also be applied to channels of texts belonging to the same language, both to study how texts of the same author may have changed over time and to compare texts of different authors. In conclusion, a common underlying mathematical structure governing human textual/verbal communication channels seems to emerge. Language does not play the only role in translation; this role is shared with the reader’s reading ability and short-term memory capacity. Different versions of the New Testament within the same language can even seem, mathematically, to belong to different languages. These conclusions are everlasting because they are valid also for ancient Roman and Greek readers.
Quantitative precipitation estimation and rainfall monitoring based on meteorological data, which potentially provide continuous, high-resolution, large-coverage data, are of high practical use: think of hydrogeological risk management, hydroelectric power, roads, and tourism. Both conventional long-range radars and rain gauges suffer from measurement errors and difficulties in precipitation estimation. For efficient monitoring of localized rain events of limited extension and of small basins of interest, an unrealistically dense rain-gauge network would be needed. Alternatively, C-band or S-band long-range meteorological radars can monitor rain fields over wide areas, but without enough spatial and temporal resolution and with high purchase and maintenance costs. Short-range X-band radars for rain monitoring can be a valid compromise between the two more common rain measurement and observation instruments. Many scientific efforts have already focused on radar-gauge adjustment and quantitative precipitation estimation in order to improve radar measurement techniques. After some considerations about long-range radars and gauge networks, this paper instead presents some examples of how X-band mini radars can be very useful for the observation of rainfall events and how they can integrate and supplement long-range radars and rain-gauge networks. Three case studies are presented: a very localized and intense event, a rainfall event with high temporal and spatial variability, and the use of an X-band mini radar in a mountainous region with narrow valleys. The adaptability of such radars to rain monitoring is demonstrated.
Maria Valtorta (1897-1961, Italian mystic), bedridden since 1934 because of paralysis, wrote in Italian 13,193 pages in 122 school notebooks concerning alleged mystical visions of Jesus’ life, during World War II and a few following years. The contents, about 2.64 million words, are now scattered across different books. She could write from 2 to 6 hours without pausing, at a steady speed, and twice in the same day. She never made corrections and was very proficient in Italian. We have studied her writing activity concerning her alleged mystical experience with the main scope of establishing the time sequence of daily writing. This is possible because she diligently annotated the date of almost every text. We have reconstructed the time series of daily words and converted them into time series of writing time, by assuming a realistic speed of 20 words per minute, a reliable average value of fast handwriting speed applicable to Maria Valtorta. She wrote for 1340 days, about 3.67 years of equivalent contiguous writing time, mostly concentrated in the years 1943 to 1948. This study is a first approach to evaluating the effort made, in terms of writing time, by a mystic who turned out to be a very effective literary author, whose texts are interesting to read per se, beyond any judgement, not of concern here, on her alleged visions.
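The conversions described above (word counts to writing time at an assumed 20 words per minute, and writing days to years of equivalent contiguous writing) are simple arithmetic and can be sketched as follows; the function names are illustrative, not the paper's.

```python
def writing_minutes(daily_words, wpm=20.0):
    """Convert a time series of daily word counts into daily writing
    time in minutes, assuming a constant handwriting speed (words/min)."""
    return [words / wpm for words in daily_words]

def contiguous_years(writing_days, days_per_year=365.25):
    """Express a number of writing days as years of equivalent
    contiguous writing time."""
    return writing_days / days_per_year

# 1340 writing days correspond to about 3.67 years of contiguous
# writing, as reported in the study.
print(round(contiguous_years(1340), 2))  # 3.67
```

The same per-day conversion, applied to the dated daily word counts, yields the reconstructed time series of writing time that the study analyzes.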
In this paper, a data-driven prognostic model capable of dealing with different sources of uncertainty is proposed. The main novelty is the application of a mathematical framework, namely the Random Fuzzy Variable (RFV) approach, for the representation and propagation of the different uncertainty sources affecting Prognostic Health Management (PHM) applications: measurement, future, and model uncertainty. In this way, it is possible to deal not only with measurement noise and model-parameter uncertainty due to the stochastic nature of the degradation process, but also with systematic effects, such as systematic errors in the measurement process, incomplete knowledge of the degradation process, and subjective beliefs about model parameters. Furthermore, the low analytical complexity of the employed prognostic model allows the measurement and parameter uncertainty to be easily propagated into the remaining-useful-life (RUL) forecast, with no need for extensive Monte Carlo loops, so that only low computation power is required. The model has been applied to two real application cases, showing high output accuracy and resulting in a potentially effective tool for predictive maintenance in different industrial sectors.
The statistical theory of language translation is used to compare how a literary character speaks to different audiences by diversifying two important linguistic communication channels: the “sentences channel” and the “interpunctions channel”. The theory can “measure” how the author shapes a character speaking to different audiences by modulating deep-language parameters. To show its power, we have applied the theory to the literary corpus of Maria Valtorta, an Italian mystic of the 20th century. The likeness index I<sub>L</sub>, ranging from 0 to 1, allows one to “measure” how similar two linguistic channels are, and therefore whether a character speaks to different audiences in the same way. A 6-dB difference between the signal-to-noise ratios of two channels already gives I<sub>L</sub> ≈ 0.5, a threshold below which the two channels depend very little on each other, therefore implying that the character addresses different audiences differently. In conclusion, multiple linguistic channels can describe the “fine tuning” that a literary author uses to diversify characters or to distinguish the behavior of the same character in different situations. The theory can be applied to literary corpora written in any alphabetical language.
Biomedical questions are usually complex and regard several different life-science aspects. Numerous valuable and heterogeneous data are increasingly available to answer such questions. Yet, they are stored dispersedly and are difficult to query comprehensively. We created a Genomic and Proteomic Data Warehouse (GPDW) that integrates data provided by some of the main bioinformatics databases. It adopts a modular integrated data schema and several metadata to describe the integrated data, their sources, and their location in the GPDW. Here, we present the Web application that we developed to enable any user to easily compose queries, even complex ones, on all data integrated in the GPDW. It is publicly available at http://www.bioinformatics.dei.polimi.it/GPKB/. Through a visual interface, the user is only required to select the types of data to be included in the query and the conditions on the values to be retrieved. The Web application then leverages the metadata and modular schema of the GPDW to automatically compose an efficient SQL query, run it on the GPDW, and show the extracted data, enriched with links to external data sources. The tests performed demonstrated the efficiency and usability of the developed Web application, and showed the relevance of the application and of the GPDW in supporting the answering of biomedical questions, even difficult ones.
It will show the feasibility of a Wireless Sensor Network (WSN) devoted to monitoring water basin, river, lake, and sea both on the surface and in depth. The swarm of floating probes can be programmed to periodically ...It will show the feasibility of a Wireless Sensor Network (WSN) devoted to monitoring water basin, river, lake, and sea both on the surface and in depth. The swarm of floating probes can be programmed to periodically sink some tens of meters below the surface, collecting data, characterizing water properties and then coming to the surface again. The life span of the probes may be assured by an on-board power supply or through batteries recharged by solar cells. The basic idea of the WSN is reported together with a detailed analysis of the operational constraints, the energy requirements, and the electronic and mechanical discussion.展开更多
Funding: Supported by the National Key Research and Development Program of China (No. 2023YFB4502200), the Natural Science Foundation of China (Nos. 92164204 and 62374063), and the Science and Technology Major Project of Hubei Province (No. 2022AEA001).
Abstract: Memtransistors, in which the source-drain channel conductance can be nonvolatilely manipulated through gate signals, have emerged as promising components for implementing neuromorphic computing. On the other hand, it is well known that complementary metal-oxide-semiconductor (CMOS) field-effect transistors have played the fundamental role in modern integrated circuit technology. Will complementary memtransistors (CMT) therefore play such a role in future neuromorphic circuits and chips? In this review, the various types of materials and physical mechanisms for constructing CMT (how) are inspected, and their merits and still-to-be-addressed challenges are discussed. Then the unique properties (what) and potential applications of CMT in different learning algorithms/scenarios of spiking neural networks (why) are reviewed, including supervised rules, reinforcement learning, dynamic vision with in-sensor computing, etc. By exploiting the novel functions related to the complementary structure, a significant reduction of hardware consumption, an enhanced energy/efficiency ratio and other advantages have been gained, illustrating the alluring prospect of design technology co-optimization (DTCO) of CMT towards neuromorphic computing.
Funding: Supported by Atlantis International (Grant P11GJ10-0067).
Abstract: The high demand for lung transplants cannot be matched by an adequate number of lungs from donors. Since fully ex-novo lungs are far from being feasible, tissue engineering is actively considering the implantation of engineered lungs, where the devitalized structure of a donor lung is used as a scaffold to be repopulated by stem cells of the receiving patient. A decellularized donated lung is treated inside a bioreactor, where transport through the tracheobronchial tree (TBT) allows both the deposition of stem cells and the nourishment for their subsequent growth, thus developing new lung tissue. The key concern is to optimally set the boundary conditions used in the bioreactor. We propose a predictive model of slow liquid ventilation, which combines a one-dimensional (1-D) mathematical model of the TBT with a solute deposition model strongly dependent on the fluid velocity across the tree. With it, we were able to track and drive the concentration of a generic solute across the airways, looking for its optimal distribution, obtained by properly adjusting the regime of the pumps serving the bioreactor. A feedback system, created by coupling the two models, allowed us to derive the optimal pattern. The TBT model is easily invertible, thus yielding a straightforward flow/pressure law at the inlet that optimizes the efficiency of the bioreactor.
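The feedback idea described above can be loosely illustrated with a minimal sketch (not the authors' implementation): a proportional correction iteratively adjusts the inlet flow until the concentration predicted by the model matches the desired target. The model function, gain and tolerance below are illustrative assumptions.

```python
def optimize_inlet_flow(model, target_conc, flow=1.0, gain=0.5,
                        tol=1e-6, max_iter=1000):
    """Drive the pump flow until the solute concentration predicted
    by the (here hypothetical) 1-D model reaches the target, using
    simple proportional feedback on the pumps' regime."""
    for _ in range(max_iter):
        error = target_conc - model(flow)
        if abs(error) < tol:
            break
        flow += gain * error  # proportional correction of the inlet flow
    return flow
```

With a linear toy model `model(flow) = 2 * flow` and a target concentration of 3.0, the loop settles at a flow of 1.5.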
Abstract: Sorafenib is an effective anti-angiogenic treatment for hepatocellular carcinoma (HCC). The assessment of tumor progression in patients treated with sorafenib is crucial to help identify potentially resistant patients, avoiding unnecessary toxicities. Traditional methods to assess tumor progression are based on variations in tumor size and provide unreliable results in patients treated with sorafenib. New methods to assess tumor progression, such as the modified Response Evaluation Criteria in Solid Tumors or the European Association for the Study of the Liver criteria, are based on imaging to measure the vascularization and tumor volume (viable or necrotic). These, however, fail especially when the tumor response results in irregular development of necrotic tissue. Newer assessment techniques focus on the evaluation of tumor volume, density or perfusion. Perfusion computed tomography and Dynamic Contrast-Enhanced Ultrasound can measure the vascularization of HCC lesions and help predict tumor response to anti-angiogenic therapies. Mean Transit Time is a possible predictive biomarker of tumor response. Volumetric techniques are reliable, reproducible and time-efficient, and can measure minimal changes in viable tumor or necrotic tissue, allowing the prompt identification of non-responders. The volume ratio may be a reproducible biomarker for tumor response. Larger trials are needed to confirm the use of these techniques in the prediction of response to sorafenib.
Abstract: This research focused on the valorisation of glycerol, exploring the feasibility of an efficient route for the production of oxygenated additives based on its etherification with bio-butanol. A home-made BEA zeolite sample with tuneable acidity was proposed as the catalytic system and tested in a stirred reactor under different etherification conditions. Although a reaction temperature as high as 200 °C proved beneficial in terms of glycerol conversion (~90%), only by operating under milder conditions can the product selectivity to glycerol ethers be better controlled, in order to obtain a bio-fuel complying with the requirements for mixing with fossil diesel or biodiesel, without any need for purification from large amounts of by-products. A comprehensive identification of all the compounds formed during the reaction was performed by GC-MS analysis, on the basis of the complex network of consecutive and parallel reaction paths leading not only to the desired ethers, but also to many side products not detected in similar acid-catalyzed liquid-phase reactions and not available in the most widely used mass-spectra libraries.
Abstract: The current IT society is populating the planet with a plethora of applications at an unprecedented rate, pushed by advances in fabrication, mechatronics and communication technologies on the one hand and by the availability of sophisticated sensor and actuator devices on the other [1]. Applications, targeting everyday life as well as industrial and mission-critical needs, are either confined to a single target device or distributed within a network of units, also taking advantage of seamless communication capabilities to address different application scenarios. Examples of this trend are the Internet-of-Things, smart-whatsoever systems (home, grid, building, city, planet) and cyber-physical systems [2].
Abstract: The present paper describes the conceptual idea of using cars as sensors to measure and acquire data related to the road environment. The parameters are collected using only standard equipment commonly installed and operative on commercial cars. Real sensors and car sub-systems (e.g. thermometers, accelerometers, ABS, ESP, and GPS) together with other "implicit" sensors (e.g. fog lights, windscreen wipers) acquire and contain information. These data are shared inside an in-vehicle communication network, mainly over the standard CAN bus, and can be collected by a simple central node. Such a node is available on the market at reasonable cost thanks to companies whose business is devoted to car-fleet monitoring. All the collected data are then geolocalized using a standard GPS receiver and sent to a remote elaboration unit, exploiting mobile network technologies such as GPRS or UMTS. A large number of cars, connected together in a diffuse Wireless Sensor Network, allows the elaboration unit to produce info-layers put at the disposal of car drivers. Traffic, state-of-the-road and weather information can be received by car drivers through an ad hoc mobile application for smartphones, which gives punctual information related to a specific route previously set on the mobile phone navigator. Some experimental activities are described, some technical points are addressed, and some examples of applications of the network of cars "as sensors" are given.
Abstract: Cardiovascular disease (CVD) risk assessment is an important instrument to enhance clinical decisions in daily practice, as well as to improve preventive health care by promoting the transfer from the hospital to the patient's home. Given its importance, clinical guidelines recommend the use of risk scores to predict the risk of a cardiovascular disease event. There are therefore several well-known risk assessment tools; unfortunately, they present some limitations. This work addresses this problem with two different methodologies: 1) combination of risk assessment tools based on a fusion of Bayesian classifiers complemented with genetic algorithm optimization; 2) personalization of risk assessment through the creation of groups of patients that maximize the performance of each risk assessment tool. The latter approach is implemented based on subtractive clustering applied to a reduced-dimension space. Both methodologies were developed for short-term CVD risk prediction in patients with Acute Coronary Syndromes without ST-segment elevation (ACS-NSTEMI). Two real patient datasets were considered to validate the developed strategies: 1) Santa Cruz Hospital, Portugal, N = 460 patients; 2) Leiria-Pombal Hospital Centre, Portugal, N = 99 patients. This work improved the performance with respect to current risk assessment tools, reaching maximum values of sensitivity, specificity and geometric mean of, respectively, 80.0%, 82.9% and 81.5%. Besides this enhancement, the proposed methodologies allow the incorporation of new risk factors, deal with missing risk factors and avoid the selection of a single tool to be applied in daily clinical practice. In spite of these achievements, CVD risk assessment (patient stratification) should be further improved. The incorporation of new risk factors recognized as clinically significant, namely parameters derived from heart rate variability (HRV), is introduced in this work.
HRV is a strong and independent predictor of mortality in patients following acute myocardial infarction. The study of the impact of HRV parameters in the characterization of coronary artery disease (CAD) patients will be conducted during the hospitalization of these patients in the Leiria-Pombal Hospital Centre (LPHC).
Abstract: We study the short-term memory capacity of ancient readers of the original New Testament written in Greek, and of its translations into Latin and modern languages. To model it, we consider the number of words between any two contiguous interpunctions, I<sub>P</sub>, because this parameter can model how the human mind memorizes "chunks" of information. Since I<sub>P</sub> can be calculated for any alphabetical text, we can perform experiments—otherwise impossible—with ancient readers by studying the literary works they used to read. The "experiments" compare the I<sub>P</sub> of texts in one language/translation to those of another language/translation by measuring the minimum average probability of finding joint readers (those who can read both texts because of similar short-term memory capacity) and by defining an "overlap index". We also define the population of universal readers: people who can read any New Testament text in any language. Future work is vast, with many research tracks, because alphabetical literatures are very large and allow many experiments, such as comparing authors, translations, or even texts written by artificial intelligence tools.
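The word-interval variable I<sub>P</sub> described above can be computed for any alphabetical text; a minimal sketch follows (which punctuation marks count as interpunctions is an assumption here):

```python
import re

def word_intervals(text):
    """I_P values of a text: split at interpunctions (punctuation
    marks) and count the number of words in each resulting chunk."""
    chunks = re.split(r"[.,;:!?]", text)
    return [len(chunk.split()) for chunk in chunks if chunk.strip()]

def mean_interval(text):
    """Average I_P, a proxy for the 'chunk' size that a reader must
    hold in short-term memory between punctuation marks."""
    intervals = word_intervals(text)
    return sum(intervals) / len(intervals)
```

For example, "In the beginning, God created the heaven and the earth." yields chunks of 3 and 7 words, giving a mean interval of 5.0.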
Funding: Supported in part by the National Natural Science Foundation of China (61533017, 61973330, 61773075, 61603387), the Early Career Development Award of SKLMCCS (20180201), and the State Key Laboratory of Synthetical Automation for Process Industries (2019-KF-23-03).
Abstract: In this paper, a data-based fault tolerant control (FTC) scheme is investigated for unknown continuous-time (CT) affine nonlinear systems with actuator faults. First, a neural network (NN) identifier based on particle swarm optimization (PSO) is constructed to model the unknown system dynamics. By utilizing the estimated system states, a particle-swarm-optimized critic neural network (PSOCNN) is employed to solve the Hamilton-Jacobi-Bellman equation (HJBE) more efficiently. Then, a data-based FTC scheme, which consists of the NN identifier and a fault compensator, is proposed to achieve actuator fault tolerance. The stability of the closed-loop system under actuator faults is guaranteed by the Lyapunov stability theorem. Finally, simulations are provided to demonstrate the effectiveness of the developed method.
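The particle swarm optimization underlying the identifier and critic networks can be illustrated with a generic, self-contained sketch of standard PSO on an arbitrary cost function (not the paper's PSOCNN; the swarm size, inertia and acceleration coefficients below are illustrative assumptions):

```python
import random

def pso_minimize(f, dim, n_particles=30, iters=200,
                 bounds=(-5.0, 5.0), w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard PSO: each particle moves under inertia (w) plus random
    attraction towards its personal best (c1) and the swarm's global
    best (c2); returns the best position and cost found."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # personal best positions
    pbest_val = [f(p) for p in pos]
    gbest = min(pbest, key=f)[:]             # global best position
    gbest_val = f(gbest)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:           # update personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:          # update global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the paper's setting, `f` would be the critic network's residual of the HJBE and the particles would encode network weights; here a simple quadratic cost suffices to show convergence.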
Funding: Financial support by MOST (2011CBA00504) and NSFC (21133010, 50921004, 212111074) of China.
Abstract: Nanocarbon materials play a critical role in the development of new or improved technologies and devices for the sustainable production and use of renewable energy. This perspective paper outlines some of the trends and outlooks in this exciting area, with the aim of evidencing some of the possibilities offered by the growing level of knowledge, as testified by the exponentially rising number of publications, and of laying the basis for a more rational design of these nanomaterials. The basic members of the new carbon family are fullerene, graphene, and the carbon nanotube. Derived from them are carbon quantum dots, nanohorns, nanofibers, nanoribbons, nanocapsules, nanocages and other nanomorphologies. Second-generation nanocarbons are those which have been modified by surface functionalization or doping with heteroatoms to create specific tailored properties. Third-generation nanocarbons are the nanoarchitectured supramolecular hybrids or composites of first- and second-generation nanocarbons, either with each other or with organic or inorganic species. The advantages of the new carbon materials for the field of sustainable energy are discussed, evidencing the unique properties that they offer for developing next-generation solar devices and energy storage solutions.
Abstract: Recent advances in the use of nanocarbon-based electrodes for the electrocatalytic conversion of gaseous streams of CO2 to liquid fuels are discussed in this perspective paper. A novel gas-phase electrocatalytic cell, different from the typical electrochemical systems working in the liquid phase, was developed. There are several advantages to working in the gas phase, e.g. no need to recover the products from a liquid phase, no problems of CO2 solubility, etc. Operating under these conditions and using electrodes based on metal nanoparticles supported on carbon nanotube (CNT) type materials, long C-chain products (in particular isopropanol under optimized conditions, but also hydrocarbons up to C8-C9) were obtained from the reduction of CO2. Pt-CNT electrodes are more stable and in some cases give a higher productivity, but Fe-CNT electrodes, particularly those using N-doped carbon nanotubes, show excellent properties and are preferable to noble-metal-based electrocatalysts because of their lower cost. The control of the localization of metal particles at the inner or outer surface of the CNTs is an important factor for the product distribution. The nature of the nanocarbon substrate also plays a relevant role in enhancing the productivity and tuning the selectivity towards long C-chain products. The electrodes for the electrocatalytic conversion of CO2 are part of a photoelectrocatalytic (PEC) solar cell concept, aimed at developing knowledge for the new generation of artificial leaf-type solar cells, which can use sunlight and water to convert CO2 to fuels and chemicals. The reduction of CO2 to liquid fuels by solar energy is a good route for introducing renewables into the existing energy and chemical infrastructures, having a higher energy density and easier transport/storage than other competing solutions (e.g. H2).
Abstract: Statistics of languages are usually calculated by counting characters, words, sentences and word rankings. Some of these random variables are also the main "ingredients" of classical readability formulae. Revisiting the readability formula for Italian, known as GULPEASE, shows that of the two terms that determine the readability index G—the semantic index G<sub>C</sub>, proportional to the number of characters per word, and the syntactic index G<sub>F</sub>, proportional to the reciprocal of the number of words per sentence—G<sub>F</sub> is dominant, because G<sub>C</sub> is, in practice, constant for any author throughout seven centuries of Italian Literature. Each author can modulate the length of sentences more freely than the length of words, and does so differently from author to author. For any author, any couple of text variables can be modelled by a linear relationship y = mx, but with a different slope m from author to author, except for the relationship between characters and words, which is unique for all. The most important relationship found in the paper is that between the short-term memory capacity, described by Miller's "7 ± 2 law" (i.e., the number of "chunks" that an average person can hold in the short-term memory ranges from 5 to 9), and the word interval, a new random variable defined as the average number of words between two successive punctuation marks. The word interval can be converted into a time interval through the average reading speed. The word interval spreads over the same range as Miller's law, and the time interval is spread over the same range as short-term memory response times. The connection between the word interval (and time interval) and short-term memory appears, at least empirically, justified and natural, but remains to be further investigated. Technical and scientific writings (papers, essays, etc.) ask more of their readers because words are on average longer, the readability index G is lower, and word and time intervals are longer.
Future work on ancient languages, such as the classical Greek and Latin Literatures (or the Literatures of modern languages), could give us insight into the short-term memory required of their well-educated readers.
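For reference, the GULPEASE index discussed above can be sketched in its commonly cited form; the constants below are an assumption to be checked against the original definition (the abstract only states that G combines a semantic term G<sub>C</sub>, driven by characters per word, and a syntactic term G<sub>F</sub>, driven by sentences per word):

```python
def gulpease(characters, words, sentences):
    """GULPEASE readability index for Italian, in its commonly cited
    form G = 89 - GC + GF, with GC = 10*characters/words (semantic
    term) and GF = 300*sentences/words (syntactic term). Higher G
    means easier text, on a roughly 0-100 scale. The constants are
    assumptions, not taken from the abstract."""
    gc = 10.0 * characters / words
    gf = 300.0 * sentences / words
    return 89.0 - gc + gf
```

A text with 400 characters, 100 words and 5 sentences would score 89 − 40 + 15 = 64 under this form.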
Abstract: Photovoltaic (PV) systems have attracted increasing attention in recent years, as have Wireless Sensor Networks (WSNs), which are used in many application fields. In PV plants, especially in ground installations, many thefts and damages occur due to the still high cost of the modules. A new experimental ad hoc WSN has been designed as an anti-theft alarm system. Each node of the network is installed directly under a PV string and is equipped with an accelerometer sensor capable of detecting a minimum displacement of the panel from its steady position. The WSN has a star topology: a master node cyclically interrogates the slave nodes through an RF link. It collects all the node responses and communicates through an RS-232 interface with a control PC that checks the network status. When a slave node detects an alarm, continuous messages are sent to the control PC, which turns on all the alarm signaling systems. The control PC is equipped with an open-source operating system and software and provides SMS, e-mail and sound-light signaling in case of alarm. It also communicates with a remote server where all the WSN information is stored. A first low-cost experimental WSN has already been installed and is working properly.
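The master node's polling cycle described above can be sketched as follows; the radio API, node identifiers and alarm threshold are hypothetical stand-ins, not the system's actual interfaces:

```python
class FakeRadio:
    """Stand-in for the RF link (an assumption: the real radio API is
    not described in the abstract). Returns None for silent nodes."""
    def __init__(self, replies):
        self.replies = replies
    def query(self, node_id):
        return self.replies.get(node_id)

def poll_cycle(nodes, radio, alarm_threshold=0.05):
    """One cycle of the master node: interrogate every slave over the
    RF link and return the ids of nodes in alarm, i.e. those whose
    accelerometer reports a displacement over the threshold or that
    do not reply at all (treated as possible tampering)."""
    alarms = []
    for node_id in nodes:
        reply = radio.query(node_id)
        if reply is None or reply["displacement"] > alarm_threshold:
            alarms.append(node_id)
    return alarms
```

The control PC would then escalate any non-empty result to the SMS/e-mail/sound-light signaling chain.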
Abstract: We propose the first statistical theory of language translation based on communication theory. The theory is based on New Testament translations from Greek to Latin and to 35 other modern languages. In a text translated into another language, all linguistic variables do numerically change. To study the chaotic data that emerge, we model any translation as a complex communication channel affected by "noise", studied according to Communication Theory, applied for the first time to this channel. This theory deals with aspects of languages more complex than those currently considered in machine translation. The input language is the "signal"; the output language is a "replica" of the input language, largely perturbed by noise that is, however, indispensable for conveying the meaning of the input language to its readers. We have defined a noise-to-signal power ratio and found that channels are differently affected by translation noise. Communication channels are also characterized by channel capacity. The translation of novels has more constraints than New Testament translations. We propose a global readability formula for alphabetical languages, not available for most of them, and conclude with a general theory of language translation which shows that direct and reverse channels are not symmetric.
The general theory can also be applied to channels of texts belonging to the same language, both to study how texts by the same author may have changed over time and to compare texts by different authors. In conclusion, a common underlying mathematical structure governing human textual/verbal communication channels seems to emerge. Language does not play the only role in translation; this role is shared with the reader's reading ability and short-term memory capacity. Different versions of the New Testament within the same language can even seem, mathematically, to belong to different languages. These conclusions are everlasting, because they are valid also for ancient Roman and Greek readers.
Abstract: Quantitative precipitation estimation and rainfall monitoring based on meteorological data, which potentially provide continuous, high-resolution and large-coverage data, are of high practical use: think of hydrogeological risk management, hydroelectric power, roads and tourism. Both conventional long-range radars and rain gauges suffer from measurement errors and difficulties in precipitation estimation. For efficient monitoring of localized rain events of limited extension and of small basins of interest, an unrealistically dense rain gauge network would be needed. Alternatively, C-band or S-band meteorological long-range radars are able to monitor rain fields over wide areas, however without sufficient spatial and temporal resolution, and with high purchase and maintenance costs. Short-range X-band radars for rain monitoring can be a valid compromise between these two more common rain measurement and observation instruments. Many scientific efforts have already focused on radar-gauge adjustment and quantitative precipitation estimation in order to improve radar measurement techniques. After some considerations about long-range radars and gauge networks, this paper presents some examples of how X-band mini radars can be very useful for the observation of rainfall events and how they can integrate and supplement long-range radars and rain gauge networks. Three case studies are presented: a very localized and intense event, a rainfall event with high temporal and spatial variability, and the use of an X-band mini radar in a mountainous region with narrow valleys. The suitability of such radars for rain monitoring is demonstrated.
Abstract: Maria Valtorta (1897-1961, Italian mystic)—bedridden since 1934 because paralyzed—wrote in Italian 13,193 pages of 122 school notebooks concerning alleged mystical visions of Jesus' life, during World War II and a few following years. The contents—about 2.64 million words—are now scattered across different books. She could write from 2 to 6 hours without pausing, at a steady speed, and twice in the same day. She never made corrections and was very proficient in Italian. We have studied her writing activity concerning her alleged mystical experience with the main scope of establishing the time sequence of her daily writing. This is possible because she diligently annotated the date of almost every text. We have reconstructed the time series of daily words and converted it into a time series of writing time, by assuming a realistic speed of 20 words per minute, a reliable average value of fast handwriting speed applicable to Maria Valtorta. She wrote for 1340 days, about 3.67 years of equivalent contiguous writing time, mostly concentrated in the years 1943 to 1948. This study is a first approach to evaluating the effort made, in terms of writing time, by a mystic who turned out to be a very effective literary author, whose texts are interesting to read per se, beyond any judgement—not of concern here—on her alleged visions.
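The words-to-time conversion used in this study is simple arithmetic and can be sketched directly; the 20 words/minute speed is the value assumed in the abstract:

```python
def writing_minutes(daily_words, words_per_minute=20):
    """Convert a time series of daily word counts into daily writing
    time in minutes, at the assumed fast-handwriting speed of
    20 words per minute."""
    return [words / words_per_minute for words in daily_words]

def total_hours(daily_words, words_per_minute=20):
    """Total writing time in hours over the whole series."""
    return sum(writing_minutes(daily_words, words_per_minute)) / 60.0
```

For example, a day with 2400 words corresponds to 120 minutes (2 hours) of writing at that speed.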
Abstract: In this paper, a data-driven prognostic model capable of dealing with different sources of uncertainty is proposed. The main novelty is the application of a mathematical framework, namely a Random Fuzzy Variable (RFV) approach, for the representation and propagation of the different uncertainty sources affecting Prognostic Health Management (PHM) applications: measurement, future and model uncertainty. In this way, it is possible to deal not only with measurement noise and model parameter uncertainty due to the stochastic nature of the degradation process, but also with systematic effects, such as systematic errors in the measurement process, incomplete knowledge of the degradation process, and subjective beliefs about model parameters. Furthermore, the low analytical complexity of the employed prognostic model allows the measurement and parameter uncertainty to be easily propagated into the RUL forecast, with no need for extensive Monte Carlo loops, so that only low computation power is required. The model has been applied to two real application cases, showing high accuracy and resulting in a potentially effective tool for predictive maintenance in different industrial sectors.
Abstract: The statistical theory of language translation is used to compare how a literary character speaks to different audiences by diversifying two important linguistic communication channels: the "sentences channel" and the "interpunctions channel". The theory can "measure" how an author shapes a character speaking to different audiences by modulating deep-language parameters. To show its power, we have applied the theory to the literary corpus of Maria Valtorta, an Italian mystic of the XX century. The likeness index I<sub>L</sub>, ranging from 0 to 1, allows one to "measure" how similar two linguistic channels are, and therefore whether a character speaks to different audiences in the same way. A 6-dB difference between the signal-to-noise ratios of two channels already gives I<sub>L</sub> ≈ 0.5, a threshold below which the two channels depend very little on each other, therefore implying that the character addresses different audiences differently. In conclusion, multiple linguistic channels can describe the "fine tuning" that a literary author uses to diversify characters or distinguish the behavior of the same character in different situations. The theory can be applied to literary corpora written in any alphabetical language.
Abstract: Biomedical questions are usually complex and regard several different life science aspects. Numerous valuable and heterogeneous data are increasingly available to answer such questions. Yet, these data are stored in dispersed sources and are difficult to query comprehensively. We created a Genomic and Proteomic Data Warehouse (GPDW) that integrates data provided by some of the main bioinformatics databases. It adopts a modular integrated data schema and several metadata to describe the integrated data, their sources and their location in the GPDW. Here, we present the Web application that we developed to enable any user to easily compose queries, however complex, on all data integrated in the GPDW. It is publicly available at http://www.bioinformatics.dei.polimi.it/GPKB/. Through a visual interface, the user is only required to select the types of data to be included in the query and the conditions on the values to be retrieved. The Web application then leverages the metadata and modular schema of the GPDW to automatically compose an efficient SQL query, run it on the GPDW, and show the extracted data, enriched with links to external data sources. The tests performed demonstrated the efficiency and usability of the developed Web application, and showed the relevance of both the application and the GPDW in supporting the answering of biomedical questions, even difficult ones.
Abstract: This paper shows the feasibility of a Wireless Sensor Network (WSN) devoted to monitoring water basins, rivers, lakes, and the sea, both at the surface and in depth. The swarm of floating probes can be programmed to periodically sink some tens of meters below the surface, collecting data and characterizing water properties, and then to come to the surface again. The life span of the probes may be assured by an on-board power supply or by batteries recharged by solar cells. The basic idea of the WSN is reported, together with a detailed analysis of the operational constraints and the energy requirements, and a discussion of the electronics and mechanics.