README.md: 9 additions & 11 deletions
@@ -113,6 +113,9 @@
remaining useful life (RUL) tasks by combining partially and fully monotonic networks. This example looks at predicting the RUL for turbofan engine degradation.
- [Battery State of Charge Estimation Using Monotonic Neural Networks](examples/monotonic/BSOCEstimateUsingMonotonicNetworks/BatteryStateOfChargeEstimationUsingMonotonicNeuralNetworks.md)
This example shows how to train two monotonic neural networks to estimate the state of charge (SOC) of a battery: one to model the charging behavior and one to model the discharging behavior. You train the networks to predict the rate of change of the state of charge and force the output to be positive for the charging network and negative for the discharging network. This way, you enforce monotonicity of the battery state of charge by constraining its derivative to be positive or negative; the short note after the examples makes this explicit.
- [Train Image Classification Lipschitz Constrained Networks and Measure

@@ -125,17 +128,12 @@
more robust classification network.
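To make the monotonicity argument in the battery SOC example above explicit: if the charging network's output (the predicted rate of change of SOC) is constrained to be non-negative, then the SOC estimate obtained by accumulating that output can never decrease. This is a short sketch of the reasoning, not text from the repository:

$$
\mathrm{SOC}(t_2) - \mathrm{SOC}(t_1) = \int_{t_1}^{t_2} \frac{d\,\mathrm{SOC}}{dt}(t)\,dt \;\ge\; 0 \quad \text{for } t_2 \ge t_1,
$$

since the integrand is non-negative by construction. The discharging network enforces the opposite sign, so its SOC estimate is non-increasing.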
## Functions
This repository introduces the following functions that are used throughout the examples:

- [`buildConstrainedNetwork`](conslearn/buildConstrainedNetwork.m) - Build a multi-layer perceptron (MLP) with specific constraints on the architecture and initialization of the weights.
- [`buildConvexCNN`](conslearn/buildConvexCNN.m) - Build a convolutional neural network (CNN) with convex constraints on the architecture and initialization of the weights.
- [`trainConstrainedNetwork`](conslearn/trainConstrainedNetwork.m) - Train a constrained network and maintain the constraint during training.
- [`lipschitzUpperBound`](conslearn/lipschitzUpperBound.m) - Compute an upper bound on the Lipschitz constant for a Lipschitz neural network.
- [`convexNetworkOutputBounds`](conslearn/convexNetworkOutputBounds.m) - Compute guaranteed upper and lower bounds on hypercubic grids for convex networks.
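As a rough illustration of how the build function is meant to be used, the sketch below builds a monotonic MLP and evaluates it. The constraint name, argument order, and the use of `predict` are assumptions inferred from the function descriptions above, not documented signatures; the linked examples show the real end-to-end workflows, including `trainConstrainedNetwork` and the bounding utilities.

```matlab
% Sketch only: the constraint name and call pattern are assumed, not the
% documented interface of buildConstrainedNetwork.
inputSize = 1;                 % one scalar input feature
numHiddenUnits = [32 16 1];    % hidden layer sizes, scalar output

% Build an MLP whose architecture and initial weights enforce monotonicity.
net = buildConstrainedNetwork("fully-monotonic", inputSize, numHiddenUnits);

% Assuming the result is a dlnetwork, evaluate it on a grid of inputs; the
% outputs should be non-decreasing in x by construction.
x = dlarray(linspace(0, 1, 5), "CB");
y = predict(net, x);
```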
examples/monotonic/BSOCEstimateUsingMonotonicNetworks/BatteryStateOfChargeEstimationUsingMonotonicNeuralNetworks.md