Forecasting Electric Energy Demand using a predictor model based on Liquid State Machine
Full Text: PDF (321.4 KB)
Source: International Journal of Artificial Intelligence and Expert Systems (IJAE)
Volume: 1    Issue: 2    Issue Pages: 7-53    Complete Issue: PDF (3.2 MB)
Publication Date: July 2010
ISSN (Online): 2180-124X
Article Pages: 40-53
Author(s):
Published Date: 10-08-2010
Publisher: CSC Journals, Kuala Lumpur, Malaysia
ADDITIONAL INFORMATION
 
KEYWORDS:   Liquid State Machine, Pulsed Neural Networks, Prediction, Electric Energy Demand 
 
 
This manuscript is indexed in the following databases/websites:
1. PDFCAST
2. Scribd
3. Directory of Open Access Journals (DOAJ)
4. Docstoc
5. Google Scholar
6. Academic Index
7. refSeek
 
 
ABSTRACT:
Electricity demand forecasts are required by companies that need to predict their customers’ demand, and by those wishing to trade electricity as a commodity on financial markets. For anyone who is not a prediction expert, it is hard to choose the right forecasting method for a given application. Recent work shows that Liquid State Machines (LSMs) can be applied to time series prediction. The main advantage of the LSM is that it projects the input data into a high-dimensional dynamical space, so simple learning methods suffice to train the readout. In this paper we present an experimental investigation of time series prediction with Liquid State Machines, modeling an LSM-based predictor in a case study on short-term and long-term electricity demand forecasting. The results are promising, considering the error threshold used to stop training the readout, the number of training iterations required by the readout, and the fact that no seasonal adjustment or other preprocessing was applied to decorrelate the time series.
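 
As a reading aid, the core idea of the abstract can be sketched in a few lines of code. This is not the authors’ implementation: the paper uses a spiking LSM (built with tools such as Circuit-Tool and CSIM, refs. 18, 32, 33) and trains the readout iteratively against an error threshold, whereas the sketch below substitutes a simplified rate-based reservoir with a ridge-regression readout and runs on synthetic demand data; every size and constant is an illustrative assumption. What it does show is the mechanism the abstract describes: the input series is projected into a high-dimensional dynamical state, and only a simple linear readout is trained on that state.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an hourly electricity demand series: a daily cycle
# plus noise (placeholder data, not the series used in the paper).
t = np.arange(2000)
demand = 50.0 + 10.0 * np.sin(2 * np.pi * t / 24) + rng.normal(0.0, 1.0, t.size)

n_res = 200                                      # reservoir ("liquid") size -- an assumption
W_in = rng.uniform(-0.5, 0.5, n_res)             # input weights
W = rng.normal(0.0, 1.0, (n_res, n_res))         # recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 gives fading memory

def run_reservoir(u):
    """Drive the reservoir with the input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for k, u_k in enumerate(u):
        x = np.tanh(W_in * u_k + W @ x)          # high-dimensional dynamical projection
        states[k] = x
    return states

u = (demand - demand.mean()) / demand.std()      # normalized input series
X = run_reservoir(u[:-1])                        # reservoir state at time k
y = u[1:]                                        # target: demand at time k+1

split = 1500                                     # train on the first 1500 steps, test on the rest
ridge = 1e-6                                     # small regularizer for the linear readout
W_out = np.linalg.solve(X[:split].T @ X[:split] + ridge * np.eye(n_res),
                        X[:split].T @ y[:split])

pred = X[split:] @ W_out                         # one-step-ahead predictions on the held-out part
print("test MSE (normalized units):", np.mean((pred - y[split:]) ** 2))

Ridge regression is used here only to keep the sketch short; any of the readout training strategies surveyed in the reservoir computing literature (e.g. refs. 11, 21) could be substituted, including the iterative training with an error-based stopping criterion that the paper reports.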
 
 
 
REFERENCES:
 
1 H. Burgsteiner, M. Kröll, A. Leopold, G. Steinbauer. “Movement prediction from real-world images using a liquid state machine”. Applied Intelligence, 36(2):99-109, 2007
2 F. Wyffels, B. Schrauwen. “A comparative study of reservoir computing strategies for monthly time series prediction”. Neurocomputing, 73(10-13):1958-1964, 2010
3 E. Joyce, T. Berg, A. Rietz. “Prediction markets as decision support systems”. Information Systems Frontiers, 5(1):79-93, 2003
4 T. Coulson, G. M. Mace, E. Hudson, H. Possingham. “The use and abuse of population viability analysis”. Trends in Ecology & Evolution, 16(5):219-221, 2001
5 S. Nickovic, G. Kallos, A. Papadopoulos, O. Kakaliagou. “A model for prediction of desert dust cycle in the atmosphere”. Journal of Geophysical Research, 106(D16):18113-18129, 2001
6 R. Drossu, Z. Obradovic. “Rapid design of neural networks for time series prediction”. IEEE Computational Science & Engineering, 3(2):78-89, 1996
7 S. Haykin. “Neural Networks: A Comprehensive Foundation”, Prentice-Hall, 2nd ed., New Jersey, 1999
8 A. F. Atiya, A. G. Parlos. “New results on recurrent network training: unifying the algorithms and accelerating convergence”. IEEE Transactions on Neural Networks, 11(3):697-709, 2000
9 W. Maass, T. Natschläger, H. Markram. “Real-time computing without stable states: a new framework for neural computation based on perturbations”. Neural Computation, 14(11):2531–2560, 2002
10 H. Jaeger. “The “echo state” approach to analysing and training recurrent neural networks”. Technical Report. Fraunhofer Institute for Autonomous Intelligent Systems: German National Research Center for Information Technology (GMD Report 148), 2001
11 D. Verstraeten, B. Schrauwen, M. D’Haene, D. Stroobandt. “An experimental unification of reservoir computing methods”. Neural Networks, 20(Special Issue):391-403, 2007a
12 A. Lazar, G. Pipa, J. Triesch. “Fading memory and time series prediction in recurrent networks with different forms of plasticity”. Neural Networks, 20(3):312-322, 2007
13 L. Pape, J. Gruijl, M. Wiering. “Democratic liquid state machines for music recognition”. Studies in Computational Intelligence (SCI), 83:191-211, 2008
14 C. Gros, G. Kaczor. “Semantic learning in autonomously active recurrent neural networks”. Logic Journal of IGPL Online, jzp045v1-jzp045, 2009
15 H. Paugam-Moisy. “Spiking Neuron Networks: A Survey”. Technical Report IDIAP-RR 06-11. IDIAP Research Institute, 2006
16 J. Vreeken. “On real-world temporal pattern recognition using Liquid State Machines”. Master’s Thesis, Utrecht University (NL), 2004
17 E. A. Antonelo, B. Schrauwen, X. Dutoit, D. Stroobandt, M. Nuttin. “Event detection and localization in mobile robot navigation using reservoir computing”. In Proceedings of the 17th International Conference on Artificial Neural Networks, LNCS 4669: 660-669, Porto, Portugal, 2007
18 IGI LSM Group. “Circuit-Tool: a tool for generating neural microcircuits”. User Manual. Institute for Theoretical Computer Science, Graz University of Technology. Available at http://www.lsm.tugraz.at, 2006
19 S. Häusler, H. Markram, W. Maass. “Perspectives of the high dimensional dynamics of neural microcircuits from the point of view of low dimensional readouts”. Complexity, 8(4):39-50, 2003
20 E. Goodman, D. Ventura. “Spatiotemporal pattern recognition via liquid state machines”. In Proceedings of the International Joint Conference on Neural Networks. Vancouver, BC, Canada, 3848-3853, 2006
21 B. Schrauwen, D. Verstraeten, J. Van Campenhout. “An overview of reservoir computing: theory, applications and implementations”. In Proceedings of the 15th European Symposium on Artificial Neural Networks. 471-482, Bruges, Belgium, 2007
22 T. Natschläger, W. Maass, H. Markram. “The “liquid computer”: a novel strategy for real-time computing on time series”. Special Issue on Foundations of Information Processing of Telematik, 8(1):39-43, 2002
23 X. Dutoit, B. Schrauwen, J. Van Campenhout, D. Stroobandt, H. Van Brussel, M. Nuttin. “Pruning and regularization in reservoir computing”. Neurocomputing, 72(7-9):1534-1546, 2009
24 K. P. Dockendorf, I. Park, P. He, J. C. Príncipe, T. B. DeMarse. “Liquid state machines and cultured cortical networks: The separation property”. BioSystems, 95(2):90-97, 2009
25 D. V. Buonomano, M. M. Merzenich. “Temporal information transformed into a spatial code by a neural network with realistic properties”. Science, 267:1028-1030, 1995
26 W. Maass, H. Markram. “On the computational power of circuits of spiking neurons”. Journal of Computer and System Sciences, 69(4):593-616, 2004
27 A. Ghani, M. McGinnity, L. Maguire, J. Harkin. “Hardware/software co-design for spike based recognition”. Computing Research Repository, CoRR abs/0807.2282, 2008
28 E. Goodman, D. Ventura. “Effectively using recurrently connected spiking neural networks”. In Proceedings of the International Joint Conference on Neural Networks. (3):1542–1547, Montreal, Canada, 2005
29 D. Verstraeten, B. Schrauwen, D. Stroobandt. “Adapting reservoirs to get gaussian distributions”. In Proceedings of the 15th European Symposium on Artificial Neural Networks. 495–500, Bruges, Belgium, 2007b
30 P. Knüsel, R. Wyss, P. König, P. F. M. J. Verschure. “Decoding a temporal population code”. Neural Computation, 16(10):2079-2100, 2004
31 R. Vink. “Temporal pattern analysis using reservoir computing”. Master’s Thesis. Leiden Institute of Advanced Computer Science, Universiteit Leiden, 2009
32 T. Natschläger. “CSIM: a neural Circuit SIMulator”. User Manual. Institute for Theoretical Computer Science, Graz University of Technology. Available at http://www.lsm.tugraz.at/csim/index.html, 2006
33 IGI LSM Group. “Learning-Tool: analysing the computational power of neural microcircuits”. User Manual. Institute for Theoretical Computer Science, Graz University of Technology. Available at http://www.lsm.tugraz.at, 2006
34 F. Wyffels, B. Schrauwen, D. Stroobandt. “Using reservoir computing in a decomposition approach for time series prediction”. In Proceedings of the European Symposium on Time Series Prediction, 149-158, Porvoo, Finland, 2008
35 S. Biswas, S. P. Mishra, S. Acharya, S. Mohanty. “A hybrid oriya named entity recognition system: harnessing the power of rule”. International Journal of Artificial Intelligence and Expert Systems (IJAE), 1(1):1-6, 2010
 
COLLABORATIVE COLLEAGUES: Neusa Grando, Tania Mezzadri Centeno, Sílvia Silva da Costa Botelho, Felipe Michels Fontoura
 
 
 