NeuralFin-Backend is the server-side component of the NeuralFin project, designed to process and analyze financial data from companies in the Middle East and North Africa (MENA) region. It provides APIs and services that support the NeuralFin-Frontend, delivering data and insights to users.
- Introduction
- Features
- Installation
- Usage
- Project Structure
- API Endpoints
- Data Processing
- Metrics Calculation
- Contributing
- License
NeuralFin-Backend serves as the core engine for the NeuralFin platform, handling data ingestion, processing, and analysis. It integrates with various data sources to collect financial information, processes this data to extract meaningful insights, and exposes APIs consumed by the front-end application.
- Data Ingestion: Collects financial data from multiple sources, including stock prices, financial statements, and economic indicators.
- Data Processing: Cleanses and transforms raw data into structured formats suitable for analysis.
- Financial Metrics Calculation: Computes key financial ratios and metrics to assess company performance.
- API Services: Provides RESTful APIs for the front-end to retrieve processed data and analytics.
- Authentication and Authorization: Manages user authentication and access control for secure data access.
To set up the NeuralFin-Backend locally, follow these steps:
- Clone the Repository:
  ```bash
  git clone https://github.com/ranystephan/NeuralFin-Backend.git
  ```
- Navigate to the Project Directory:
  ```bash
  cd NeuralFin-Backend
  ```
- Install Dependencies:
  ```bash
  npm install
  ```
- Set Up Environment Variables: Create a `.env` file in the root directory and define the necessary environment variables, such as database connection strings and API keys.
- Run Database Migrations:
  ```bash
  npm run migrate
  ```
- Start the Server:
  ```bash
  npm start
  ```
Once the server is running, it will listen for incoming HTTP requests on the configured port (default is 3000). You can interact with the API endpoints using tools like Postman or through the NeuralFin-Frontend application.
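For example, a quick way to confirm the API is responding (a minimal sketch assuming the default port 3000 and JSON responses; adjust the host and port to your configuration):

```python
import requests

# List the MENA companies exposed by the backend (assumes the server is running locally)
response = requests.get("http://localhost:3000/api/companies")
response.raise_for_status()
print(response.json())
```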
The project's directory structure is organized as follows:
```
NeuralFin-Backend/
├── src/
│   ├── controllers/
│   ├── models/
│   ├── routes/
│   ├── services/
│   ├── utils/
│   └── index.js
├── tests/
├── migrations/
├── .env
├── package.json
└── README.md
```
- `src/controllers/`: Contains controllers that handle incoming requests and orchestrate responses.
- `src/models/`: Defines data models and schemas for interacting with the database.
- `src/routes/`: Sets up API routes and associates them with corresponding controllers.
- `src/services/`: Implements business logic and data processing functionalities.
- `src/utils/`: Includes utility functions and helpers used across the application.
- `src/index.js`: Entry point of the application, initializing the server and connecting to the database.
- `tests/`: Contains test cases to ensure the reliability of the application.
- `migrations/`: Holds database migration scripts for schema changes.
- `.env`: Environment configuration file (not included in version control).
The backend exposes several RESTful API endpoints for data retrieval and manipulation. Below is an overview of the primary endpoints:
- User Authentication:
  - POST `/api/auth/register`: Registers a new user.
  - POST `/api/auth/login`: Authenticates a user and returns a JWT token.
- Financial Data:
  - GET `/api/companies`: Retrieves a list of companies in the MENA region.
  - GET `/api/companies/:id`: Fetches detailed information for a specific company.
  - GET `/api/companies/:id/metrics`: Obtains calculated financial metrics for a company.
- Market Data:
  - GET `/api/markets`: Provides data on various financial markets.
  - GET `/api/markets/:id`: Retrieves information about a specific market.
For a comprehensive list of endpoints and their functionalities, refer to the API documentation.
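As an illustration, a typical client flow might look like the sketch below. The request and response shapes (field names like `token`, the Bearer header, the example credentials and company id) are assumptions; check them against the API documentation.

```python
import requests

BASE_URL = "http://localhost:3000"  # default port from the configuration above

# Authenticate and obtain a JWT (field names are assumptions)
login = requests.post(
    f"{BASE_URL}/api/auth/login",
    json={"email": "user@example.com", "password": "secret"},
)
login.raise_for_status()
token = login.json()["token"]

# Use the token to fetch calculated metrics for a specific company
headers = {"Authorization": f"Bearer {token}"}
metrics = requests.get(f"{BASE_URL}/api/companies/1/metrics", headers=headers)
metrics.raise_for_status()
print(metrics.json())
```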
Data processing is a critical component of NeuralFin-Backend, involving several steps:
- Data Ingestion: Collects raw financial data from external APIs, databases, and other sources.
- Data Cleaning: Removes inconsistencies, handles missing values, and standardizes data formats.
- Data Transformation: Converts raw data into structured formats, creating necessary fields and relationships.
- Data Storage: Saves processed data into the database for efficient retrieval and analysis.
These processes ensure that the data served to the front-end is accurate, up-to-date, and relevant.
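To make the cleaning and transformation steps concrete, here is a minimal pandas sketch; the column names and rules are hypothetical illustrations, not the project's actual pipeline:

```python
import pandas as pd

def clean_prices(raw: pd.DataFrame) -> pd.DataFrame:
    """Illustrative cleaning/transformation step (hypothetical column names)."""
    df = raw.copy()
    df["date"] = pd.to_datetime(df["date"], errors="coerce")  # standardize date formats
    df = df.dropna(subset=["date", "close"])                  # drop rows missing key fields
    df["close"] = df["close"].astype(float)
    df = df.sort_values("date").drop_duplicates(subset=["ticker", "date"])
    df["daily_return"] = df.groupby("ticker")["close"].pct_change()  # derived field
    return df
```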
The backend calculates various financial metrics to assess company performance. Below are examples of such metrics along with their calculation formulas:
- Price-to-Earnings (P/E) Ratio:
  $$\text{P/E Ratio} = \frac{\text{Market Price per Share}}{\text{Earnings per Share (EPS)}}$$
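A minimal sketch of how such a ratio can be computed, guarding against non-positive earnings (the function name is illustrative):

```python
def pe_ratio(price_per_share, eps):
    # P/E is undefined when earnings per share are zero or negative
    if eps <= 0:
        return None
    return price_per_share / eps

# Example: a share trading at 24.0 with EPS of 1.5 has a P/E of 16.0
print(pe_ratio(24.0, 1.5))
```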
The backend of NeuralFin supports the computation of both fundamental financial metrics and advanced quantitative finance metrics. These include measures for portfolio optimization, risk assessment, and performance evaluation.
The Markowitz model helps in selecting the optimal portfolio by maximizing returns for a given level of risk or minimizing risk for a given level of returns.
Key Formulas:
- Portfolio Expected Return:
  $$E(R_p) = \sum_{i=1}^n w_i E(R_i)$$
  where $w_i$ is the weight of asset $i$ in the portfolio and $E(R_i)$ is its expected return.
- Portfolio Variance:
  $$\sigma_p^2 = \sum_{i=1}^n \sum_{j=1}^n w_i w_j \sigma_{ij}$$
  where $\sigma_{ij}$ is the covariance between assets $i$ and $j$.
Implementation (Pseudocode):
```python
import numpy as np

# Covariance matrix and expected returns for two assets
cov_matrix = np.array([[0.04, 0.02], [0.02, 0.03]])
expected_returns = np.array([0.10, 0.12])

# Portfolio weights
weights = np.array([0.6, 0.4])

# Expected return of the portfolio
portfolio_return = np.dot(weights, expected_returns)

# Portfolio variance
portfolio_variance = np.dot(weights.T, np.dot(cov_matrix, weights))
```
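The snippet above evaluates a fixed set of weights; selecting Markowitz-optimal weights requires an optimization step. Below is a minimal sketch of a long-only minimum-variance portfolio using `scipy.optimize`; the solver choice and constraints are assumptions, not the project's actual implementation.

```python
import numpy as np
from scipy.optimize import minimize

def min_variance_weights(cov_matrix):
    """Long-only, fully invested minimum-variance weights (illustrative sketch)."""
    n = cov_matrix.shape[0]

    def portfolio_variance(w):
        return w @ cov_matrix @ w

    result = minimize(
        portfolio_variance,
        x0=np.full(n, 1.0 / n),                                        # start from equal weights
        method="SLSQP",
        bounds=[(0.0, 1.0)] * n,                                       # no short selling (assumption)
        constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],  # weights sum to 1
    )
    return result.x

cov_matrix = np.array([[0.04, 0.02], [0.02, 0.03]])
print(min_variance_weights(cov_matrix))  # roughly [0.33, 0.67] for this covariance matrix
```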
- Beta: Measures the sensitivity of a stock's returns to market returns.
  $$\beta = \frac{\text{Cov}(R_i, R_m)}{\text{Var}(R_m)}$$
- Alpha: Represents the excess return of an asset or portfolio above the return its beta would predict (Jensen's alpha).
  $$\alpha = R_i - \left[R_f + \beta \left(R_m - R_f\right)\right]$$
  where $R_f$ is the risk-free rate.
Implementation:
```python
import numpy as np

def calculate_beta(stock_returns, market_returns):
    # Sample covariance with the market divided by the market's sample variance
    covariance = np.cov(stock_returns, market_returns)[0, 1]
    variance_market = np.var(market_returns, ddof=1)  # ddof=1 matches np.cov's default
    return covariance / variance_market

def calculate_alpha(actual_return, beta, market_return, risk_free_rate):
    # CAPM-expected return: risk-free rate plus beta times the market risk premium
    expected_return = risk_free_rate + beta * (market_return - risk_free_rate)
    return actual_return - expected_return
```
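For example, with short illustrative return series (the numbers are made up):

```python
stock_returns = np.array([0.02, -0.01, 0.03, 0.015, -0.005])
market_returns = np.array([0.015, -0.008, 0.025, 0.012, -0.002])

beta = calculate_beta(stock_returns, market_returns)
alpha = calculate_alpha(
    actual_return=np.mean(stock_returns),
    beta=beta,
    market_return=np.mean(market_returns),
    risk_free_rate=0.001,
)
print(beta, alpha)
```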
VaR estimates the maximum potential loss of a portfolio over a specified time period with a given confidence level.
Formula (parametric VaR, assuming normally distributed portfolio returns):
$$\text{VaR}_{\alpha} = -\left(\mu_p + z_{1-\alpha}\,\sigma_p\right)$$
where $\mu_p$ and $\sigma_p$ are the mean and standard deviation of portfolio returns, $\alpha$ is the confidence level, and $z_{1-\alpha} = \Phi^{-1}(1-\alpha)$.
Implementation:
```python
from scipy.stats import norm

def calculate_var(portfolio_mean, portfolio_std, confidence_level):
    # Parametric (normal) VaR: the loss exceeded with probability 1 - confidence_level
    z = norm.ppf(1 - confidence_level)  # about -1.645 for a 95% confidence level
    return -(portfolio_mean + z * portfolio_std)
```
CVaR calculates the average loss beyond the VaR threshold.
Formula:
$$\text{CVaR}_{\alpha} = E\left[\,L \mid L > \text{VaR}_{\alpha}\,\right]$$
where $L$ denotes the portfolio loss.
Implementation:
```python
import numpy as np

def calculate_cvar(losses, var):
    # Average loss in the tail beyond the VaR threshold
    return np.mean([loss for loss in losses if loss > var])
```
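A small usage sketch tying VaR and CVaR together on simulated losses (the distribution parameters are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated daily portfolio losses (positive values are losses); parameters are arbitrary
losses = -rng.normal(loc=0.0005, scale=0.02, size=10_000)

var_95 = np.quantile(losses, 0.95)        # empirical 95% VaR threshold
cvar_95 = calculate_cvar(losses, var_95)  # average loss beyond that threshold
print(var_95, cvar_95)
```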
The Sharpe Ratio measures the risk-adjusted return of a portfolio as excess return per unit of total volatility.
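Formula:
$$\text{Sharpe Ratio} = \frac{R_p - R_f}{\sigma_p}$$
where $R_p$ is the portfolio return, $R_f$ the risk-free rate, and $\sigma_p$ the standard deviation of portfolio returns.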
Implementation:
```python
def calculate_sharpe_ratio(portfolio_return, risk_free_rate, portfolio_std):
    return (portfolio_return - risk_free_rate) / portfolio_std
```
The Sortino Ratio is a variation of the Sharpe Ratio that considers only downside risk, penalizing volatility from negative returns rather than total volatility.
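No implementation is given above, so here is a minimal sketch under the usual definition, using downside deviation relative to a target return (the default target of zero and the function name are illustrative assumptions):

```python
import numpy as np

def calculate_sortino_ratio(portfolio_returns, risk_free_rate, target_return=0.0):
    portfolio_returns = np.asarray(portfolio_returns)
    # Downside deviation: root-mean-square of returns falling below the target
    downside = np.minimum(portfolio_returns - target_return, 0.0)
    downside_deviation = np.sqrt(np.mean(downside ** 2))
    excess_return = np.mean(portfolio_returns) - risk_free_rate
    return excess_return / downside_deviation
```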
Expected Shortfall provides a more comprehensive risk measure by capturing the average loss in the tail distribution of losses.
Implementation:
```python
import numpy as np

def calculate_expected_shortfall(losses, alpha):
    # alpha is the tail probability (e.g. 0.05 for the worst 5% of losses)
    sorted_losses = sorted(losses, reverse=True)       # largest losses first
    threshold_index = max(1, int(len(losses) * alpha))
    return np.mean(sorted_losses[:threshold_index])
```
- Data Ingestion:
  - Financial data (price history, volume, fundamentals) is collected from external APIs (e.g., Refinitiv, Bloomberg).
  - Data is ingested into a PostgreSQL database (see the sketch after this list).
- Data Processing:
  - Data is cleaned, normalized, and aggregated.
  - Financial metrics and risk measures are computed.
- API Exposure:
  - Results are served through RESTful endpoints for use by the NeuralFin-Frontend.
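As a rough illustration of the ingestion step, the sketch below writes daily price rows into PostgreSQL with `psycopg2`; the connection string, table name, and columns are hypothetical, not the project's actual schema.

```python
import psycopg2

def store_daily_prices(rows):
    """Insert (ticker, date, close, volume) tuples into a hypothetical daily_prices table."""
    conn = psycopg2.connect("dbname=neuralfin user=postgres host=localhost")
    try:
        with conn, conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO daily_prices (ticker, trade_date, close_price, volume) "
                "VALUES (%s, %s, %s, %s) ON CONFLICT DO NOTHING",
                rows,
            )
    finally:
        conn.close()

store_daily_prices([("TICKER", "2024-01-02", 7.85, 1_000_000)])
```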
- Fork the repository.
- Create a branch for your feature or fix.
- Submit a pull request.
This project is licensed under the MIT License.
NeuralFin-Backend serves as the analytical powerhouse for NeuralFin, enabling robust and scalable analysis of financial data. Its comprehensive metric calculations and RESTful APIs facilitate the integration of advanced financial analytics with user-friendly front-end interfaces.