Mathematical Modeling Competition and SZ Education in Colleges and Universities
Title Description
Establishment and solution of the model
Research status
Analytic hierarchy process plus data visualization charts (an illustrative AHP weight calculation is sketched below)
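To make the first of these methods concrete, the following is a minimal Python sketch of the analytic hierarchy process; the pairwise comparison matrix and the three criteria are assumptions made purely for illustration, not values taken from the article. Weights are taken from the principal eigenvector of the comparison matrix, and the consistency ratio checks whether the judgments are acceptably consistent.

# Minimal AHP sketch (illustrative only): the pairwise comparison matrix below is a
# made-up example, not data from the article.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],      # criterion 1 compared with criteria 1..3
              [1/3, 1.0, 2.0],      # criterion 2 compared with criteria 1..3
              [1/5, 1/2, 1.0]])     # criterion 3 compared with criteria 1..3

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                                # normalized priority weights

lambda_max = eigvals.real[k]
n = A.shape[0]
CI = (lambda_max - n) / (n - 1)                # consistency index
RI = 0.58                                      # random index for n = 3 (Saaty's table)
CR = CI / RI                                   # consistency ratio; CR < 0.1 is usually acceptable

print("weights:", w)
print("CR:", CR)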
Compared with other college mathematics courses, the teaching content and teaching process of mathematical modeling contain more SZ (ideological and political) elements. One reason is that its teaching content is drawn mainly from problems in students' real lives, so students readily identify with it; another is that mathematical modeling requires students to practice in person, which overcomes the shortcomings of other mathematics teaching activities and helps SZ ideas take deep root in students' minds. The unique role of mathematical modeling in SZ education is reflected in the following aspects.
Ideal mathematical models are rarely achieved overnight. They are abstracted and refined through repeated trial, error and testing before becoming models widely used in engineering and technology. For example, the British economist Malthus put forward the population model of geometric growth in 1798. The Belgian mathematician Verhulst developed the Malthusian model in the mid-19th century and proposed a blocked-growth (logistic) model that describes more objectively how populations, and the numbers of many other species, change over time. Later, in order to apply population theory to the spread of infectious diseases, epidemiologists extended Verhulst's results, developing the theory still further.
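As a simple illustration (not part of the original discussion), the short Python sketch below contrasts the two models; the growth rate r, carrying capacity K and initial population N0 are made-up values.

# Minimal sketch: Malthusian exponential growth vs. Verhulst's logistic ("blocked") growth.
# The parameters r, K and N0 are illustrative assumptions, not data from the text.
import numpy as np

r, K, N0 = 0.03, 1000.0, 100.0                 # growth rate, carrying capacity, initial population
t = np.linspace(0, 200, 201)

N_malthus = N0 * np.exp(r * t)                          # dN/dt = r*N        ->  N(t) = N0 * e^(r*t)
N_verhulst = K / (1 + (K / N0 - 1) * np.exp(-r * t))    # dN/dt = r*N*(1 - N/K), saturates at K

print("Malthus  at t=200:", N_malthus[-1])     # grows without bound
print("Verhulst at t=200:", N_verhulst[-1])    # levels off near the carrying capacity K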
The modeling and solving process requires students to combine mathematical knowledge, computer programming, information retrieval and other skills to obtain the solution of the mathematical model. It is the part of the whole mathematical modeling process with the most traditional mathematical flavor. Because modeling practice assignments and competitions are time-limited, students must strike a balance between speed and accuracy: if they work too slowly, they cannot finish within the specified time; if they blindly pursue speed, they may make mistakes along the way and have to restart the entire modeling process. Through the solving process, students come to appreciate that haste makes waste; they learn to advance the project step by step and cultivate patience and perseverance along the way.
Development trend
Grey prediction
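The program below implements a GM(1,1) grey prediction model: the raw series x(0) is accumulated once (1-AGO) to obtain x(1); the grey development coefficient a and grey input u are estimated by least squares from the background values z(1)(k) = 0.5 * (x(1)(k-1) + x(1)(k)); the time-response function x^(1)(k+1) = (x(0)(1) - u/a) * e^(-a*k) + u/a is then evaluated and differenced back to forecast the original series. Before training, the code also checks the smoothness ratio and the level-ratio test to judge whether the series is suitable for grey modeling.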
Program code
from decimal import Decimal

import matplotlib.pyplot as plt
import numpy as np


class GM11():
    def __init__(self):
        self.f = None  # time-response function, set by train()

    def isUsable(self, X0):
        '''Judge whether the series passes the smoothness test'''
        X1 = X0.cumsum()
        rho = [X0[i] / X1[i - 1] for i in range(1, len(X0))]
        rho_ratio = [rho[i + 1] / rho[i] for i in range(len(rho) - 1)]
        print("rho:", rho)
        print("rho_ratio:", rho_ratio)
        flag = True
        for i in range(2, len(rho) - 1):
            if rho[i] > 0.5 or rho[i + 1] / rho[i] >= 1:
                flag = False
        if rho[-1] > 0.5:
            flag = False
        if flag:
            print("Data pass smooth check")
        else:
            print("The data failed the smooth check")

        '''Judge whether the series passes the level ratio test'''
        lambds = [X0[i - 1] / X0[i] for i in range(1, len(X0))]
        X_min = np.e ** (-2 / (len(X0) + 1))
        X_max = np.e ** (2 / (len(X0) + 1))
        for lambd in lambds:
            if lambd < X_min or lambd > X_max:
                print('The data failed the level ratio test')
                return
        print('The data passed the level ratio test')

    def train(self, X0):
        X1 = X0.cumsum()
        # Background values built from the accumulated series
        Z = (np.array([-0.5 * (X1[k - 1] + X1[k]) for k in range(1, len(X1))])).reshape(len(X1) - 1, 1)
        # Data matrices A, B
        A = (X0[1:]).reshape(len(Z), 1)
        B = np.hstack((Z, np.ones(len(Z)).reshape(len(Z), 1)))
        # Grey parameters estimated by least squares
        a, u = np.linalg.inv(np.matmul(B.T, B)).dot(B.T).dot(A)
        u = Decimal(u[0])
        a = Decimal(a[0])
        print("Grey parameter a:", a, ", Grey parameter u:", u)
        # Time-response function of the accumulated series
        self.f = lambda k: (Decimal(X0[0]) - u / a) * np.exp(-a * k) + u / a

    def predict(self, k):
        X1_hat = [float(self.f(t)) for t in range(k)]
        X0_hat = np.diff(X1_hat)                   # difference back to the original series
        X0_hat = np.hstack((X1_hat[0], X0_hat))
        return X0_hat

    def evaluate(self, X0_hat, X0):
        '''
        Judge the prediction results by the posterior error ratio and the small error probability
        :param X0_hat: prediction results
        :param X0: actual values
        '''
        S1 = np.std(X0, ddof=1)                # standard deviation of the original data sample
        S2 = np.std(X0 - X0_hat, ddof=1)       # standard deviation of the residual sample
        C = S2 / S1                            # posterior error ratio
        Pe = np.mean(X0 - X0_hat)
        temp = np.abs((X0 - X0_hat - Pe)) < 0.6745 * S1
        p = np.count_nonzero(temp) / len(X0)   # small error probability
        print("Standard deviation of original data sample:", S1)
        print("Standard deviation of residual sample:", S2)
        print("Posterior error ratio:", C)
        print("Small error probability p:", p)


if __name__ == '__main__':
    plt.rcParams['font.sans-serif'] = ['SimHei']  # use a font that can render Chinese characters
    plt.rcParams['axes.unicode_minus'] = False    # show the minus sign correctly on the axes

    # Raw data X
    X = np.array(
        [21.2, 22.7, 24.36, 26.22, 28.18, 30.16, 32.34, 34.72, 37.3, 40.34,
         44.08, 47.92, 51.96, 56.02, 60.14, 64.58, 68.92, 73.36, 78.98, 86.6])
    # Training set
    X_train = X[:int(len(X) * 0.7)]
    # Test set
    X_test = X[int(len(X) * 0.7):]

    model = GM11()
    model.isUsable(X_train)              # judge the feasibility of the model
    model.train(X_train)                 # train
    Y_pred = model.predict(len(X))       # forecast
    Y_train_pred = Y_pred[:len(X_train)]
    Y_test_pred = Y_pred[len(X_train):]
    model.evaluate(Y_test_pred, X_test)  # assessment

    # Visualization
    plt.grid()
    plt.plot(np.arange(len(X_train)), X_train, '->')
    plt.plot(np.arange(len(X_train)), Y_train_pred, '-o')
    plt.legend(['Actual number of lectures', 'Prediction value of grey prediction model'])
    plt.title('Training set')
    plt.show()

    plt.grid()
    plt.plot(np.arange(len(X_test)), X_test, '->')
    plt.plot(np.arange(len(X_test)), Y_test_pred, '-o')
    plt.legend(['Actual number of lectures', 'Prediction value of grey prediction model'])
    plt.title('Test set')
    plt.show()
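As a common rule of thumb for grading grey models (a convention for reading the output, not something enforced by the code above), a posterior error ratio C of at most about 0.35 together with a small error probability p of at least about 0.95 is taken as a good fit, while C above roughly 0.65 or p below roughly 0.70 suggests that the model is not adequate for the data.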