Brain-computer interfaces (BCIs) have been extensively studied as a means of restoring sensorimotor function. A BCI enables an individual to control an external device using the electrical activity of their own brain, recorded by electrodes and then decoded, translated, and actuated. Most state-of-the-art decoding techniques rely on offline analysis, making it impractical for a portable BCI to implement such complex computation in hardware; on the other hand, the classification capability of look-up-table-based approaches is limited in on-chip implementations. An on-chip intelligent system based on an artificial neural network (ANN) has been designed that can effectively decode electrocorticography (ECoG) signals of single-finger movements. The main building blocks of this decoding architecture are a hardware-friendly version of principal component analysis (PCA) and a multi-layer perceptron (MLP). In this thesis, we focus mainly on the hardware implementation of the multi-layer perceptron that performs movement classification. The neural network is trained offline, and the learned weights are used to model the ANN in the Verilog hardware description language, making it implementable on an FPGA. Various ANN architectures were considered to optimize the design with respect to trade-offs among area, power, speed, and accuracy. The proposed architecture predicts single-finger movements with more than 80% accuracy. This implementation serves as a pathway toward a real-time BCI system capable of predicting volitional movement intentions.
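
The decoding pipeline summarized above can be sketched in software as a minimal model: PCA-style features feeding a small fixed-point MLP whose quantized weights stand in for values stored on-chip. This is an illustration only; all dimensions, weights, and the fixed-point format are assumptions, not the thesis's actual design.

```python
import numpy as np

def quantize(x, frac_bits=8):
    """Round to a fixed-point grid (Q-format), as a hardware model might.
    The number of fractional bits is an assumed parameter."""
    scale = 1 << frac_bits
    return np.round(np.asarray(x) * scale) / scale

def mlp_decode(features, W1, b1, W2, b2, frac_bits=8):
    """PCA features -> hidden layer (ReLU) -> class scores, fixed-point."""
    h = np.maximum(0.0, quantize(features @ W1 + b1, frac_bits))
    scores = quantize(h @ W2 + b2, frac_bits)
    return int(np.argmax(scores))  # predicted finger index

# Assumed sizes: 8 principal components, 16 hidden units, 5 finger classes.
rng = np.random.default_rng(0)
n_pc, n_hidden, n_fingers = 8, 16, 5
W1 = quantize(rng.normal(0, 0.5, (n_pc, n_hidden)))   # stand-in learned weights
b1 = quantize(rng.normal(0, 0.1, n_hidden))
W2 = quantize(rng.normal(0, 0.5, (n_hidden, n_fingers)))
b2 = quantize(rng.normal(0, 0.1, n_fingers))

features = quantize(rng.normal(0, 1.0, n_pc))  # stand-in PCA output
pred = mlp_decode(features, W1, b1, W2, b2)
```

Because every weight and activation is snapped to a fixed-point grid, this model mirrors the precision constraints a Verilog implementation would face, which is where the area/accuracy trade-offs mentioned above arise.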