Machine Learning: Linear Algebra 101
Linear Algebra: Scalars, Vectors, Matrices, Tensors

- Scalar: a single number.
- Vector: a 1-dimensional array of numbers, arranged as a row or a column, where each element is identified by one index.
- Matrix: a 2-dimensional array of numbers, where each element is identified by two indices.
- Tensor: an array with more than two dimensions. In general, an array of numbers with an arbitrary number of dimensions is called a tensor.
- If a 2-dimensional matrix has shape (i, j), a 3-dimensional tensor of shape (k, i, j) is a stack of k such (i, j) matrices.
- If a 2-dimensional matrix has shape (i, j), a 4-dimensional tensor of shape (l, m, i, j) is an (l, m) grid of such (i, j) matrices.
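The stacking convention above can be sketched in NumPy; the shapes here (k = 4, i = 2, j = 3, and an (l, m) = (5, 4) grid) are illustrative choices, not from the original text:

```python
import numpy as np

# stack k = 4 matrices of shape (i, j) = (2, 3) into a 3-D tensor
matrices = [np.ones((2, 3)) for _ in range(4)]
tensor3 = np.stack(matrices)      # shape (k, i, j) = (4, 2, 3)

# an (l, m) = (5, 4) grid of (2, 3) matrices forms a 4-D tensor
tensor4 = np.ones((5, 4, 2, 3))   # shape (l, m, i, j)

print(tensor3.shape)
print(tensor4.shape)
```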
import numpy as np
Defining a scalar

x = 6
x

6

Defining a vector

x = np.array((1, 2, 3))
x

array([1, 2, 3])

print('Vector Dimensions: {}'.format(x.shape))
print('Vector size: {}'.format(x.size))

Vector Dimensions: (3,)
Vector size: 3
Defining a matrix

x = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
x

array([[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]])

print('Matrix Dimensions: {}'.format(x.shape))
print('Matrix size: {}'.format(x.size))

Matrix Dimensions: (3, 3)
Matrix size: 9
Defining a matrix of a given dimension

x = np.ones((3, 3))
x

array([[1., 1., 1.],
       [1., 1., 1.],
       [1., 1., 1.]])

x = np.ones((2, 3, 3, 5))
x

array([[[[1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.]],

        [[1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.]],

        [[1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.]]],


       [[[1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.]],

        [[1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.]],

        [[1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.],
         [1., 1., 1., 1., 1.]]]])

print('Tensor Dimensions: {}'.format(x.shape))
print('Tensor size: {}'.format(x.size))

Tensor Dimensions: (2, 3, 3, 5)
Tensor size: 90
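np.ones is only one of several array constructors; a few common alternatives (these particular shapes and values are illustrative, not from the original text):

```python
import numpy as np

zeros = np.zeros((2, 3))             # all-zero matrix
sevens = np.full((2, 3), 7)          # matrix filled with a constant
ramp = np.arange(6).reshape(2, 3)    # 0..5 laid out as 2 rows of 3

print(zeros.shape, sevens.shape, ramp.shape)
```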
Indexing

A = np.ones((5, 5), dtype=int)
A

array([[1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1]])
Indexing starts at 0

A[0, 1] = 2
A

array([[1, 2, 1, 1, 1],
       [1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1]])

A[:, 0] = 3
A

array([[3, 2, 1, 1, 1],
       [3, 1, 1, 1, 1],
       [3, 1, 1, 1, 1],
       [3, 1, 1, 1, 1],
       [3, 1, 1, 1, 1]])

A[:, :] = 5
A

array([[5, 5, 5, 5, 5],
       [5, 5, 5, 5, 5],
       [5, 5, 5, 5, 5],
       [5, 5, 5, 5, 5],
       [5, 5, 5, 5, 5]])
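Slices can also select a rectangular sub-block rather than a whole row or column; a small sketch (the indices chosen here are illustrative):

```python
import numpy as np

A = np.ones((5, 5), dtype=int)
A[1:3, 2:4] = 9   # rows 1-2, columns 2-3 (the slice end index is exclusive)
print(A)
```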
A 3-dimensional tensor of shape (6, 5, 5): a stack of six 2-dimensional (5, 5) matrices

A = np.ones((6, 5, 5), dtype=int)
A

array([[[1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]],

       [[1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]],

       [[1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]],

       [[1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]],

       [[1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]],

       [[1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]]])
For higher dimensions, simply add an index; assign a new value to element (0, 0) of every (5, 5) matrix

A[:, 0, 0] = 3
A

array([[[3, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]],

       [[3, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]],

       [[3, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]],

       [[3, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]],

       [[3, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]],

       [[3, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1]]])
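NumPy's ellipsis (`...`) offers a shorthand for "all remaining axes", which keeps higher-dimensional indexing readable; a sketch (the shape and value are illustrative):

```python
import numpy as np

A = np.ones((6, 5, 5), dtype=int)
A[..., 0] = 3   # first column of every matrix; equivalent to A[:, :, 0] = 3
print(A[0])
```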
Matrix operations

A = np.array([[1, 2], [3, 4]])
print(A)
print('Matrix Dimensions: {}'.format(A.shape))
print('Matrix size: {}'.format(A.size))

[[1 2]
 [3 4]]
Matrix Dimensions: (2, 2)
Matrix size: 4

B = np.ones((2, 2), dtype=int)
print(B)
print('Matrix Dimensions: {}'.format(B.shape))
print('Matrix size: {}'.format(B.size))

[[1 1]
 [1 1]]
Matrix Dimensions: (2, 2)
Matrix size: 4
Element-wise sum

C = A + B
print(C)
print('Matrix Dimensions: {}'.format(C.shape))
print('Matrix size: {}'.format(C.size))

[[2 3]
 [4 5]]
Matrix Dimensions: (2, 2)
Matrix size: 4

Element-wise subtraction

C = A - B
print(C)
print('Matrix Dimensions: {}'.format(C.shape))
print('Matrix size: {}'.format(C.size))

[[0 1]
 [2 3]]
Matrix Dimensions: (2, 2)
Matrix size: 4
Matrix multiplication (note: np.dot computes the matrix product, not an element-wise product)

C = np.dot(A, B)
print(C)
print('Matrix Dimensions: {}'.format(C.shape))
print('Matrix size: {}'.format(C.size))

[[3 3]
 [7 7]]
Matrix Dimensions: (2, 2)
Matrix size: 4
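The distinction matters: the element-wise (Hadamard) product multiplies corresponding entries, while the matrix product sums products over an inner axis. A quick comparison using the same A and B:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.ones((2, 2), dtype=int)

elementwise = A * B   # Hadamard (element-wise) product
matmul = A @ B        # matrix product, equivalent to np.dot(A, B)

print(elementwise)
print(matmul)
```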
Matrix transpose

# matrix transpose
A = np.array(range(9))
A = A.reshape(3, 3)
print(A)
print('Matrix Dimensions: {}'.format(A.shape))
print('Matrix size: {}'.format(A.size))

[[0 1 2]
 [3 4 5]
 [6 7 8]]
Matrix Dimensions: (3, 3)
Matrix size: 9

B = A.T
print(B)
print('Matrix Dimensions: {}'.format(B.shape))
print('Matrix size: {}'.format(B.size))

[[0 3 6]
 [1 4 7]
 [2 5 8]]
Matrix Dimensions: (3, 3)
Matrix size: 9

C = B.T
print(C)
print('Matrix Dimensions: {}'.format(C.shape))
print('Matrix size: {}'.format(C.size))

[[0 1 2]
 [3 4 5]
 [6 7 8]]
Matrix Dimensions: (3, 3)
Matrix size: 9

A = np.array(range(10))
A = A.reshape(2, 5)
print(A)
print('Matrix Dimensions: {}'.format(A.shape))
print('Matrix size: {}'.format(A.size))

[[0 1 2 3 4]
 [5 6 7 8 9]]
Matrix Dimensions: (2, 5)
Matrix size: 10

B = A.T
print(B)
print('Matrix Dimensions: {}'.format(B.shape))
print('Matrix size: {}'.format(B.size))

[[0 5]
 [1 6]
 [2 7]
 [3 8]
 [4 9]]
Matrix Dimensions: (5, 2)
Matrix size: 10
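For tensors with more than two axes, .T reverses all axes; np.transpose with an explicit axis order gives finer control. A sketch (the (2, 3, 5) shape is illustrative):

```python
import numpy as np

T = np.arange(30).reshape(2, 3, 5)
S = np.transpose(T, (0, 2, 1))   # swap the last two axes, keep the first
print(S.shape)
```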
Tensor
# tensor
A = np.ones((3, 3, 3, 3, 3, 3, 3, 3, 3, 3), dtype=int)
print('Tensor Dimensions: {}'.format(A.shape))
print('Tensor size: {}'.format(A.size))

Tensor Dimensions: (3, 3, 3, 3, 3, 3, 3, 3, 3, 3)
Tensor size: 59049
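The size of any array is the product of its shape entries, so ten axes of length 3 give 3^10 = 59049 elements; a quick check:

```python
import numpy as np

A = np.ones((3,) * 10, dtype=int)   # ten axes of length 3
print(A.size)
print(int(np.prod(A.shape)))        # product of the shape entries
```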
方俊贤 (Ken Fang); A9 Atlas Studio

August 2019: a deep-learning algorithm that predicts the impact of continuous deployment and continuous release of microservices on overall product quality was granted a patent by the China National Intellectual Property Administration, in accordance with Article 44 of the Implementing Regulations of the Patent Law.

Patent number: 201910652769.4

Main areas of expertise: applying deep-learning models to analyze developer behavior and improve software development efficiency and quality; microservice architecture design; exploratory testing; mining valuable product features; user behavior (scenario) analysis; domain-driven design.

Previously: Expert Project Manager at Tencent Technology (Shenzhen) Co., Ltd.; Chief Consultant at Jacobson Software (Beijing) Co., Ltd.; Rational; Telelogic; Borland; United Microelectronics Corporation; King Yuan Electronics.

Over twenty years of product R&D and consulting experience across the semiconductor and telecom industries, military research institutes, and the Internet sector.

M.S. in Computer Science, Illinois Institute of Technology, Chicago, USA.