Initializers
RecIS provides multiple parameter initializers:
TruncNormalInitializer
- class recis.nn.initializers.TruncNormalInitializer(mean: float = 0.0, std: float = 1.0, a: float = -2.0, b: float = 2.0, generator: Generator = None)[source]
- Truncated normal distribution initializer.
- Initializes parameters by sampling from a truncated normal distribution. This is similar to normal initialization, except that values falling outside the truncation bounds [a, b] (expressed in standard deviations from the mean) are resampled so that every value lies within bounds. This is useful when you want normally distributed values but need to avoid extreme outliers. A seeded-generator sketch follows the example below.
- Parameters:
- mean (float, optional) – Mean of the normal distribution. Defaults to 0.0. 
- std (float, optional) – Standard deviation of the normal distribution. Defaults to 1.0. 
- a (float, optional) – Lower truncation bound in units of standard deviations from the mean. Defaults to -2.0. 
- b (float, optional) – Upper truncation bound in units of standard deviations from the mean. Defaults to 2.0. 
- generator (torch.Generator, optional) – Random number generator for reproducible initialization. Defaults to None. 
 
 - Example
   >>> # Truncated normal with small std and tight bounds
   >>> initializer = TruncNormalInitializer(mean=0.0, std=0.02, a=-2.0, b=2.0)
   >>> initializer.set_shape([512, 256])
   >>> initializer.build()
   >>> tensor = initializer.generate()  # Values in [-0.04, 0.04]
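- Seeded-generator sketch. A minimal illustration of reproducible initialization, assuming the generator argument accepts a standard seeded torch.Generator as documented above; the seed and shape are illustrative, not prescribed by RecIS.
  >>> # Hypothetical seeded run: the same seed should yield the same tensor
  >>> import torch
  >>> gen = torch.Generator().manual_seed(42)
  >>> initializer = TruncNormalInitializer(mean=0.0, std=0.02, a=-2.0, b=2.0, generator=gen)
  >>> initializer.set_shape([512, 256])
  >>> initializer.build()
  >>> tensor = initializer.generate()  # Reproducible values in [-0.04, 0.04]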
ConstantInitializer
- class recis.nn.initializers.ConstantInitializer(init_val: float = 0.0, dtype: dtype = torch.float32)[source]
- Constant value initializer.
- Initializes all parameters with the same constant value. This is useful for bias initialization or when you want all parameters to start from the same value. A dtype sketch follows the example below.
- Parameters:
- init_val (float, optional) – The constant value to initialize parameters. Defaults to 0.0. 
- dtype (torch.dtype, optional) – Data type of the initialized parameters. Defaults to torch.float32. 
 
 - Example
   >>> # Initialize all parameters to 0.1
   >>> initializer = ConstantInitializer(init_val=0.1)
   >>> initializer.set_shape([10, 5])
   >>> initializer.build()
   >>> tensor = initializer.generate()  # All values will be 0.1
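- dtype sketch. A minimal illustration assuming the dtype argument accepts other floating-point torch dtypes besides the torch.float32 default; the half-precision choice here is illustrative only.
  >>> # Hypothetical half-precision bias initialization (dtype choice is an assumption)
  >>> import torch
  >>> initializer = ConstantInitializer(init_val=0.0, dtype=torch.float16)
  >>> initializer.set_shape([128])
  >>> initializer.build()
  >>> bias = initializer.generate()  # Expected: all zeros with dtype torch.float16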
UniformInitializer
- class recis.nn.initializers.UniformInitializer(a: float = 0.0, b: float = 1.0, generator: Generator = None)[source]
- Uniform distribution initializer.
- Initializes parameters by sampling from a uniform distribution over the interval [a, b). This provides a simple way to spread initial values evenly across a specified range. A fan-in-scaled bounds sketch follows the example below.
- Parameters:
- a (float, optional) – Lower bound of the uniform distribution (inclusive). Defaults to 0.0. 
- b (float, optional) – Upper bound of the uniform distribution (exclusive). Defaults to 1.0. 
- generator (torch.Generator, optional) – Random number generator for reproducible initialization. Defaults to None. 
 
 - Example
   >>> # Initialize parameters uniformly between -0.1 and 0.1
   >>> initializer = UniformInitializer(a=-0.1, b=0.1)
   >>> initializer.set_shape([100, 50])
   >>> initializer.build()
   >>> tensor = initializer.generate()
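- Fan-in-scaled bounds sketch. A common convention is to pick symmetric bounds proportional to 1/sqrt(fan_in); this scaling is a conventional choice shown for illustration, not something RecIS requires.
  >>> # Symmetric bounds scaled by fan-in (scaling choice is illustrative)
  >>> import math
  >>> fan_in = 50
  >>> bound = 1.0 / math.sqrt(fan_in)
  >>> initializer = UniformInitializer(a=-bound, b=bound)
  >>> initializer.set_shape([100, fan_in])
  >>> initializer.build()
  >>> tensor = initializer.generate()  # Values in [-bound, bound)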
NormalInitializer
- class recis.nn.initializers.NormalInitializer(mean: float = 0.0, std: float = 1.0, generator: Generator = None)[source]
- Normal (Gaussian) distribution initializer.
- Initializes parameters by sampling from a normal distribution with the specified mean and standard deviation. This is one of the most common initialization strategies for neural networks. A reproducibility sketch follows the example below.
- Parameters:
- mean (float, optional) – Mean of the normal distribution. Defaults to 0.0. 
- std (float, optional) – Standard deviation of the normal distribution. Defaults to 1.0. 
- generator (torch.Generator, optional) – Random number generator for reproducible initialization. Defaults to None. 
 
 - Example
   >>> # Initialize with small random values around zero
   >>> initializer = NormalInitializer(mean=0.0, std=0.01)
   >>> initializer.set_shape([784, 128])
   >>> initializer.build()
   >>> tensor = initializer.generate()
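- Reproducibility sketch. A minimal check assuming that two initializers seeded with the same torch.Generator seed produce identical draws; the seed and shape are illustrative.
  >>> # Hypothetical check: identical seeds should give identical tensors
  >>> import torch
  >>> init_a = NormalInitializer(mean=0.0, std=0.01, generator=torch.Generator().manual_seed(7))
  >>> init_b = NormalInitializer(mean=0.0, std=0.01, generator=torch.Generator().manual_seed(7))
  >>> init_a.set_shape([784, 128]); init_a.build()
  >>> init_b.set_shape([784, 128]); init_b.build()
  >>> torch.equal(init_a.generate(), init_b.generate())  # Expected: True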