Sliding Mode Control.
Posted by seminar surveyer, 28-12-2010, 01:42 PM
Submitted By: Shibayan Chatterjee
SMC_report.doc (Size: 861 KB / Downloads: 61)

Introduction

Definition

Control theory is a branch of mathematics and engineering that deals with dynamical systems. The desired output of a system is called the reference; when one or more outputs are required to follow the reference over time, a controller is designed that manipulates the input so as to produce the desired output.

The role of control theory can be understood with a simple example: the driving system of an automobile. The desired speed can be changed or maintained by controlling the pressure on the accelerator pedal. This constitutes a control strategy in which the input to the system is the pressure on the accelerator pedal, which opens or closes the carburetor valve so as to increase or decrease the fuel flowing to the engine, thereby bringing the automobile speed under control (figure 1). In the diagrammatic representation of this example, each block represents an element, device, plant or mechanism, and each block has an input and an output signal linked by a particular relationship.

Types of Control Systems

Control systems are basically of two types: open loop and closed loop. An open loop control system does not automatically correct for variations in its output. For such systems the output remains invariant for a particular input until the external conditions are altered; the output can only be changed by changing the internal parameters or the external conditions of the system.
Open loop control is normally used where the system is capable of withstanding such variations. A closed loop system, on the other hand, is one in which the input to the system depends on the output: the output signal is sensed and compared with the desired output, generating an error signal that is fed to the controller. The feedback element is usually a sensor or a transducer that keeps continuous track of the output signal. Such controllers are largely insensitive to changes in external factors, because the controller acts on the error signal to drive the output toward the desired value (figures 2, 3).

To design any control system it is essential to know the physical characteristics of the system to be controlled, which means knowing its input and output parameters. An idealized physical system is called a physical model; a physical model should be made to a degree of accuracy appropriate to the specific problem. The next step is to obtain a mathematical formula describing the system. This may be a differential equation, an integral equation, or any other input-output equation that completely defines the system. This equation defines the control block and hence is used in control system problems.

Control theory is basically of two types: linear control theory and nonlinear control theory.

Linear Control Systems

Linear control theory deals with systems that are linear in nature, in which the output is proportional to the input; it is mainly concerned with single input, single output systems.

NonLinear Control Systems

Nonlinear control theory deals with systems that are nonlinear in nature, in which the output does not follow any specific pattern with respect to the input, and which may have multiple inputs and multiple outputs. Most systems found physically are nonlinear in nature.
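The difference between open loop and closed loop control can be made concrete with a toy cruise-control simulation. The sketch below is a minimal illustration, not part of the report: the first-order plant model, the gains and the disturbance value are all assumptions chosen for demonstration.

```python
# Toy cruise control: first-order speed model v' = -a*v + b*u + d,
# integrated with a small time step. The open-loop controller applies a
# fixed input computed for the nominal (disturbance-free) plant; the
# closed-loop controller feeds back the error between the reference
# speed and the measured speed. All numbers are illustrative.

a, b = 0.5, 1.0        # plant parameters (drag, actuator gain)
dt, steps = 0.01, 5000
ref = 20.0             # desired speed (m/s)
d = -2.0               # constant disturbance (e.g. a hill), unknown to
                       # the open-loop controller

def simulate(control):
    v = 0.0
    for _ in range(steps):
        u = control(v)
        v += dt * (-a * v + b * u + d)
    return v

# Open loop: input chosen so the nominal plant would settle at ref.
u_ff = a * ref / b
v_open = simulate(lambda v: u_ff)

# Closed loop: proportional feedback on the error signal (ref - v).
kp = 50.0
v_closed = simulate(lambda v: kp * (ref - v))

print(f"open loop final speed:   {v_open:.2f}")   # misses ref because of d
print(f"closed loop final speed: {v_closed:.2f}") # close to ref despite d
```

The open-loop speed settles at ref + d/a (16 m/s here), while the feedback loop settles near the reference, which is exactly the insensitivity to external disturbances described above.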
So it becomes very important to study nonlinear systems.

Single input, single output (SISO) systems are concerned with only a single input and its corresponding output. Multiple input, multiple output (MIMO) systems deal with multiple inputs and their outputs at specific times. A MIMO system can usually be analyzed using SISO methods, with a certain degree of approximation: only one input is applied at a time while all the others are set to zero, the outputs are noted, and the procedure is repeated for every other input.

Controllability and Observability

Controllability and observability are the main issues in the analysis of a system before deciding the best control strategy to be applied, or whether it is even possible to control or stabilize the system. Controllability is related to the possibility of forcing the system into a particular state by using an appropriate control signal. If a state is not controllable, then no signal will ever be able to control that state. If a state is not controllable but its dynamics are stable, then the state is termed stabilizable. Observability, in turn, is related to the possibility of "observing", through output measurements, the state of a system. If a state is not observable, the controller will never be able to determine the behavior of that unobservable state and hence cannot use it to stabilize the system. However, similar to the stabilizability condition above, a state that cannot be observed might still be detectable. Solutions to problems of uncontrollable or unobservable systems include adding actuators and sensors.

Control Strategies

Every control system must first guarantee the stability of the closed-loop behavior. For linear systems, this is obtained by directly placing the poles. Nonlinear control systems use specific theories to ensure stability without regard to the inner dynamics of the system.
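For a linear system x' = Ax + Bu, controllability can be tested numerically: the system is controllable when the controllability matrix [B, AB, ..., A^(n-1)B] has full rank. A minimal sketch for a two-state system follows; the matrices A and B are illustrative values chosen for the example, not taken from the report.

```python
# Controllability test for a 2-state linear system x' = A x + B u.
# The system is controllable iff the controllability matrix [B, AB]
# has full rank, i.e. a nonzero determinant when n = 2.
# A and B below are illustrative (a damped second-order plant).

A = [[0.0, 1.0],
     [-2.0, -3.0]]   # state matrix
B = [0.0, 1.0]       # input enters through the second state

# AB = A @ B (2x2 matrix times 2x1 vector)
AB = [A[0][0] * B[0] + A[0][1] * B[1],
      A[1][0] * B[0] + A[1][1] * B[1]]

# Determinant of the controllability matrix with columns B and AB
det = B[0] * AB[1] - B[1] * AB[0]

controllable = abs(det) > 1e-12
print("det [B, AB] =", det, "-> controllable:", controllable)
```

An analogous rank test on [C; CA] (with the output matrix C) decides observability, which reflects the duality between the two properties mentioned above.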
The possibility of fulfilling different specifications varies with the model considered and the control strategy chosen. A summary of the main control techniques follows.

Adaptive control

Adaptive control uses online identification of the process parameters, or modification of controller gains, thereby obtaining strong robustness properties. Adaptive controls were applied for the first time in the aerospace industry in the 1950s, and have found particular success in that field.

Hierarchical control

A hierarchical control system is a type of control system in which a set of devices and governing software is arranged in a hierarchical order. When the links in the tree are implemented by a computer network, such a hierarchical control system is also a form of networked control system.

Intelligent control

Intelligent control uses various AI computing approaches, such as neural networks, Bayesian probability, fuzzy logic, machine learning, evolutionary computation and genetic algorithms, to control a dynamic system.

Optimal control

Optimal control is a control technique in which the control signal optimizes a certain "cost index": for example, in the case of a satellite, the jet thrusts needed to bring it to the desired trajectory while consuming the least amount of fuel. Two optimal control design methods have been widely used in industrial applications, as it has been shown that they can guarantee closed-loop stability: Model Predictive Control (MPC) and Linear-Quadratic-Gaussian (LQG) control.

Robust control

Robust control deals explicitly with uncertainty in its approach to controller design. Controllers designed using robust control methods are able to cope with small differences between the true system and the nominal model used for design.

Stochastic control

Stochastic control deals with control design under uncertainty in the model.
In typical stochastic control problems, it is assumed that random noise and disturbances exist in the model and the controller, and the control design must take these random deviations into account.
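The optimal control idea above can be made concrete in the simplest case: a scalar plant x' = a*x + b*u with quadratic cost J = ∫(q*x² + r*u²)dt, for which the optimal gain follows from the scalar algebraic Riccati equation. The sketch below is illustrative only; the numeric values are assumptions, not from the report.

```python
import math

# Scalar LQR: plant x' = a*x + b*u, cost J = integral of q*x^2 + r*u^2.
# The algebraic Riccati equation 2*a*p - (b*p)**2 / r + q = 0 has the
# positive solution computed below; the optimal feedback is u = -k*x
# with k = b*p/r. All numbers are illustrative.

a, b = 1.0, 1.0      # unstable plant (open-loop pole at +1)
q, r = 1.0, 1.0      # state and input weights in the cost index

p = r * (a + math.sqrt(a * a + q * b * b / r)) / (b * b)  # Riccati solution
k = b * p / r                                             # optimal gain

pole = a - b * k     # closed-loop pole, equals -sqrt(a^2 + q*b^2/r)
print(f"gain k = {k:.4f}, closed-loop pole = {pole:.4f}")
```

Note that the closed-loop pole is always negative regardless of the sign of a, which is one way of seeing the guaranteed closed-loop stability mentioned above for these optimal designs.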


