DataPredict™ Neural [Release 1.12] (Mature + Maintenance Mode) - Deep Learning Library - 45+ Models + Recurrent Deep Reinforcement Learning!

Release Version 1.11 / Beta Version 1.7.0

Added

  • Added recurrent versions of all the deep reinforcement learning models from the “Models” section to the “RecurrentModels” section. The models include:

    • RecurrentVanillaPolicyGradient

    • RecurrentActorCritic

    • RecurrentAdvantageActorCritic

    • RecurrentSoftActorCritic

    • RecurrentProximalPolicyOptimization

    • RecurrentProximalPolicyOptimizationClip

    • RecurrentDeepDeterministicPolicyGradient

    • RecurrentTwinDelayedDeepDeterministicPolicyGradient

    • RecurrentREINFORCE

    • RecurrentMonteCarloControl

    • RecurrentOffPolicyMonteCarloControl

    • RecurrentDeepQLearning

    • RecurrentDeepStateActionRewardStateAction

    • RecurrentDeepExpectedStateActionRewardStateAction

    • RecurrentDeepClippedDoubleQLearning

    • RecurrentDeepDoubleQLearningV1

    • RecurrentDeepDoubleQLearningV2

    • RecurrentDeepDoubleStateActionRewardStateActionV1

    • RecurrentDeepDoubleStateActionRewardStateActionV2

    • RecurrentDeepDoubleExpectedStateActionRewardStateActionV1

    • RecurrentDeepDoubleExpectedStateActionRewardStateActionV2

Changes

  • Refactored all the code under the “Container” section.

  • Refactored the IterativeTrainingWrapper code under the “Utilities” section.

  • Improved the first derivative tensor calculations for ConstantPadding, ReplicationPadding and ReflectionPadding under the “PaddingBlocks” section.


FANTASTIC!!

Just wait until you see the next generation of this library. That said, not all features (such as the reinforcement learning models) are included in the Beta version.


Release Version 1.12 / Beta Version 1.8.0

Added

  • BaseWeightBlock now has the gradientAscent() function.
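To illustrate the idea behind a gradient ascent function (this is a minimal conceptual sketch in Python, not DataPredict's actual `gradientAscent()` implementation): ascent adds the scaled gradient instead of subtracting it, which maximizes an objective. This is the update direction used by reinforcement learning methods that maximize expected reward.

```python
# Sketch only: generic gradient ascent update rule, assuming a simple
# list-of-weights representation (not DataPredict's tensor types).

def gradient_ascent_step(weights, gradients, learning_rate):
    # w <- w + lr * dJ/dw (gradient descent would subtract instead)
    return [w + learning_rate * g for w, g in zip(weights, gradients)]

# Example: maximize J(w) = -(w - 3)^2, whose gradient is dJ/dw = -2 * (w - 3).
weights = [0.0]
for _ in range(100):
    gradients = [-2.0 * (weights[0] - 3.0)]
    weights = gradient_ascent_step(weights, gradients, 0.1)

print(round(weights[0], 4))  # converges toward the maximizer w = 3
```

The only difference from a descent step is the sign of the update, which is why a weight block that already supports descent can expose ascent cheaply.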

Changes

  • BaseWeightBlock can now perform in-place weight tensor updates, and this behaviour is enabled by default. With in-place updates, models train without creating additional tables on every update step, which improves performance.
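The performance difference comes from allocation: an out-of-place update builds a fresh container every step, while an in-place update mutates the existing one. A minimal sketch of the contrast, in Python for illustration (the function names here are hypothetical, not DataPredict's API):

```python
# Conceptual sketch: out-of-place vs in-place weight updates. The
# in-place variant reuses the existing container instead of allocating
# a new one each step -- the same idea as reusing weight tensors rather
# than creating additional tables on every update.

def update_out_of_place(weights, gradients, lr):
    # Builds and returns a brand-new list on every call.
    return [w - lr * g for w, g in zip(weights, gradients)]

def update_in_place(weights, gradients, lr):
    # Mutates the existing list; no new container is allocated.
    for i in range(len(weights)):
        weights[i] -= lr * gradients[i]
    return weights

weights = [1.0, 2.0, 3.0]
original = weights

weights = update_out_of_place(weights, [0.5, 0.5, 0.5], 0.1)
print(weights is original)  # False: a new list was created

result = update_in_place(weights, [0.5, 0.5, 0.5], 0.1)
print(result is weights)    # True: the same list was reused
```

Avoiding a per-step allocation also reduces garbage-collection pressure, which matters when updates run every training step.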