Long Short-Term Memory (LSTM) networks have achieved tremendous success in making sense of time-series data across multiple domains, including financial forecasting. However, LSTM models are often designed to handle time-series sequences of a fixed length, and the optimal length varies considerably across datasets and over time. A common practice is to train multiple LSTM models with different lengths and let these models vote on the final forecast. This bagging idea is more effective than individual models but is computationally expensive. To overcome this issue, this dissertation presents an innovative ensemble LSTM model that employs a set of LSTM models for robust forecasting while requiring significantly fewer computational resources. Our approach differs from existing methods in two ways. First, the individual LSTM networks in the ensemble share network parameters with each other, so the total number of network parameters is much smaller than the sum of the individual LSTMs' parameters. This design substantially improves the computational efficiency of the forecasting system. Second, a weighting scheme is introduced to assign different confidence levels to the individual LSTM models. We apply the proposed method to forecast the exchange rate between the US dollar and the Euro. We also implement and test two other bagging techniques for comparative studies. Experiments on the historical currency data suggest that the proposed ensemble model outperforms the baseline methods while using four times fewer computational resources. We also apply the ensemble LSTM to three other financial forecasting problems: the CCI index, the S&P 500 index, and the U.S. unemployment rate, and obtain consistent improvements in both performance and efficiency.
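The two design ideas in the abstract, a shared parameter set reused by all ensemble members and a confidence-weighted vote, can be illustrated with a minimal NumPy sketch. This is not the dissertation's actual architecture: the class name, the choice of letting member k read only the most recent `lengths[k]` observations of one shared LSTM cell, and the softmax confidence weights `alpha` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SharedLSTMEnsemble:
    """Illustrative sketch (not the dissertation's exact model): one set of
    LSTM-cell weights is shared by every ensemble member; member k forecasts
    from the hidden state after reading the last lengths[k] observations, and
    the member forecasts are blended with softmax-normalized confidences."""

    def __init__(self, input_dim, hidden_dim, lengths):
        self.lengths = lengths
        self.hidden_dim = hidden_dim
        d = input_dim + hidden_dim
        # Single shared parameter set: the four gates (input, forget,
        # cell candidate, output) are stacked into one weight matrix, so
        # the parameter count does not grow with the number of members.
        self.W = rng.normal(0.0, 0.1, (4 * hidden_dim, d))
        self.b = np.zeros(4 * hidden_dim)
        self.w_out = rng.normal(0.0, 0.1, hidden_dim)  # shared readout
        self.alpha = np.zeros(len(lengths))            # member confidences

    def _run(self, seq):
        # Standard LSTM-cell recurrence over one sequence.
        h = np.zeros(self.hidden_dim)
        c = np.zeros(self.hidden_dim)
        for x in seq:
            z = self.W @ np.concatenate([x, h]) + self.b
            i, f, g, o = np.split(z, 4)
            c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
            h = sigmoid(o) * np.tanh(c)
        return h

    def predict(self, seq):
        # Member k sees only the most recent lengths[k] steps, giving the
        # ensemble its multiple effective sequence lengths.
        preds = np.array([self.w_out @ self._run(seq[-L:])
                          for L in self.lengths])
        weights = np.exp(self.alpha) / np.exp(self.alpha).sum()
        return float(weights @ preds)

# Toy usage on a random univariate series.
series = rng.normal(size=(30, 1))
model = SharedLSTMEnsemble(input_dim=1, hidden_dim=8, lengths=[5, 10, 20])
forecast = model.predict(series)
```

In this sketch the `alpha` confidences would be learned jointly with the shared weights; with `alpha` at zero the forecast reduces to a plain average of the members, which is the conventional bagging baseline the abstract compares against.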