Project dependencies may have API risk issues #46

Open

PyDeps opened this issue Oct 26, 2022 · 0 comments
PyDeps commented Oct 26, 2022

Hi. In MobileStyleGAN.pytorch, inappropriate dependency version constraints can introduce risks.

Below are the dependencies and version constraints that the project currently uses:

wheel
torch
pytorch-lightning==1.0.2
gdown==3.12.2
addict==2.2.1
piq==0.5.2
numpy==1.17.5
PyWavelets==1.1.1
git+https://github.com/fbcotter/pytorch_wavelets.git
neptune-client==0.4.132
kornia==0.4.1
pytorch_fid
coremltools

The version constraint == introduces a risk of dependency conflicts, because it makes the accepted range of versions too strict.
Constraints with no upper bound (or *) introduce a risk of missing-API errors, because the latest version of a dependency may remove APIs that the project calls.

After further analysis, in this project:
The version constraint of dependency pytorch-lightning can be changed to >=0.3.6.9,<=0.5.2.1.
The version constraint of dependency gdown can be changed to >=3.7.0,<=4.5.1.
The version constraint of dependency kornia can be changed to >=0.2.1,<=0.6.2.

These modifications reduce the risk of dependency conflicts as far as possible while still allowing the latest versions that do not raise API errors in the project.
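
For reference, here is a minimal sketch of how the suggested ranges could be checked against the versions installed in an environment. The ranges are the ones suggested above; the check itself is only an illustration (it assumes the packaging library is available) and is not part of MobileStyleGAN.pytorch:

```python
# Minimal sketch: compare installed versions with the suggested ranges.
# The ranges are the suggestions from this issue; the check is illustrative only.
from importlib.metadata import PackageNotFoundError, version

from packaging.specifiers import SpecifierSet

SUGGESTED_RANGES = {
    "pytorch-lightning": SpecifierSet(">=0.3.6.9,<=0.5.2.1"),
    "gdown": SpecifierSet(">=3.7.0,<=4.5.1"),
    "kornia": SpecifierSet(">=0.2.1,<=0.6.2"),
}

for name, spec in SUGGESTED_RANGES.items():
    try:
        installed = version(name)
    except PackageNotFoundError:
        print(f"{name}: not installed")
        continue
    ok = installed in spec  # SpecifierSet accepts a version string here
    print(f"{name} {installed}: {'within' if ok else 'outside'} {spec}")
```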

The project currently invokes all of the following methods.

The calling methods from pytorch-lightning:
pytorch_lightning.callbacks.ModelCheckpoint
pytorch_lightning.Trainer.fit
pytorch_lightning.Trainer
The calling methods from gdown:
gdown.cached_download
The calling methods from kornia:
kornia.augmentation.RandomHorizontalFlip
kornia.augmentation.RandomAffine
kornia.augmentation.RandomErasing
All the methods called across the project (a usage sketch for the three dependencies above follows this list):
json.dump
UpFirDn2dBackward.apply
torch.no_grad
torch.rsqrt.view
core.models.discriminator.Discriminator
real_pred.F.softplus.mean
SynthesisBlock
self.idwt.size
cfg.type.pl_loggers.getattr
fid_inception_v3
Wrapper
torch.unbind
ToRGB
self.perceptual_loss
distiller
prep_filt_sfb2d
pred.self.inception.view
self.branch3x3dbl_1
self.dwt_to_img.size
R1Regularization
torch.nn.Parameter
t.add_
json.load
torch.nn.functional.max_pool2d
FIDInceptionE_2
self.modulation
j.img_s.cpu
core.models.mobile_synthesis_network.MobileSynthesisNetwork
ConstantInput
gradgrad_input.reshape.reshape
x_pred.sum
style_b.unsqueeze.repeat.unsqueeze
cv2.imwrite
style.view.unsqueeze
self.layers
idwt.DWTInverse
size.size.b.torch.randn.to
k.source_state.size
torch.nn.functional.conv_transpose2d
self.branch7x7dbl_4
self.FIDInceptionC.super.__init__
torch.tensor
modules.DWTInverse
self.loss.reg_d
range
self.branch3x3_1
self.get_demodulation.view
core.models.mapping_network.MappingNetwork
torch.tensor.sum
style_a.unsqueeze.repeat
torch.load
self.mapping_net.apply
FIDInceptionC
random.randint
cv2.waitKey
self.branch5x5_1
t.mul.add_.clamp_
UpFirDn2dBackward.apply.view
pytorch_wavelets.DWTInverse
torch.utils.data.DataLoader
self.get_demodulation
torch.utils.cpp_extension.load.fused_bias_act
gdown.cached_download
ResBlock
pytorch_lightning.callbacks.ModelCheckpoint
core.models.mapping_network.MappingNetwork.state_dict
high.view.view
torch.autograd.grad.size
torch.onnx.export
self.weight.unsqueeze
t.mul.add_.clamp_.permute
pred.squeeze.squeeze.cpu.numpy.size
self.net._modules.items
image.new_empty
distiller.size
self.to_img1
numpy.iscomplexobj
EqualConv2d
self.to_img1.view
img_t.cpu
block.conv2.load_state_dict
self.parameters
enumerate
self.branch3x3dbl_3b
self.synthesis_net.append
Blur
core.loss.perceptual_loss.PerceptualLoss
scipy.linalg.sqrtm
modules.legacy.PixelNorm.append
getattr
self.noise
torch.nn.LeakyReLU
torch.cat
grad_x.size.grad_x.view.norm
core.models.synthesis_network.SynthesisNetwork
torch.nn.ModuleList
self.student
int_to_mode
FIDInceptionE_1
torch.uint8.img.to.numpy
style_a.unsqueeze.repeat.unsqueeze
gradgrad_out.view.view
covmean.np.isfinite.all
format
UpFirDn2d.apply
conv_module
build_logger
sorted
torch.cat.view
core.utils.select_weights.size
torch.nn.BatchNorm2d
upfirdn2d_native
stddev.mean.squeeze
torch.nn.BatchNorm2d.train.to
numpy.isfinite
FusedLeakyReLU
torch.nn.functional.conv2d.size
input.reshape.reshape
block.to_rgb.load_state_dict
core.utils.download_ckpt
ctx.save_for_backward
self.branch7x7dbl_1
style.view.size
self.register_buffer
self.dwt
stddev.repeat.var
self.mapping_net
self.branch7x7dbl_2
cv2.imshow
core.distiller.Distiller.to_onnx
multiprocessing.cpu_count
ValueError
torch.nn.AdaptiveAvgPool2d
self.mapping_net.style_dim.self.wsize.torch.randn.to
weight.transpose.reshape.pow
hasattr
self.generator_step
EqualLinear
StyledConv2d
noise_injection.NoiseInjection
batch.to.to
torch.rsqrt
core.utils.tensor_to_img
torchvision.models.inception_v3.load_state_dict
self.net
bias.view
self.weight_dw.transpose
core.distiller.Distiller.to_coreml
self.student.append
self.branch3x3dbl_2
core.utils.save_cfg
distiller.cpu
torch.save
self.branch3x3_2a
torch.optim.Adam
StyledConv
out.view.view
offset.sigma1.dot
torch.cat.sigmoid
stddev.repeat.mean
modules.MultichannelIamge
img.size
core.utils.apply_trace_model_mode
block_idx.InceptionV3.to.eval
self.final_linear
self.branch7x7_3
self.get_modulation
opts.append
self.loss_weights.items
t.mul
self.branch1x1
torch.zeros
torch.nn.L1Loss
extract_snet
torch.nn.functional.interpolate
torchvision.transforms.ToTensor
pathlib.Path.glob
batch.self.mapping_net.unsqueeze.repeat
self.blocks.append
torch.autograd.grad
var.self.mapping_net.view
img.view.size
self.modulation.bias.data.fill_
self
self.conv.view
stddev.repeat.repeat
style.self.modulation.view
self.l1_loss
torch.nn.functional.conv2d
fake.self.F.softplus.mean
v.size
self.branch5x5_2
self.make_sample
target.state_dict.items
diff.dot
hidden.size.hidden.size.torch.randn.to
self.branch3x3dbl_3
math.log
self.branch3x3dbl_3a
chr
self.FIDInceptionE_1.super.__init__
mnet.layers.load_state_dict
upfirdn2d
range.t.add_.div_
torch.nn.Sequential
self.log
kernel.torch.flip.view
out.permute.permute
self.mapping_net.style_dim.torch.randn.to
RuntimeError
core.models.synthesis_network.SynthesisNetwork.state_dict
torch.utils.model_zoo.load_url
numpy.load
self.to_img1.size
weight.transpose.reshape.transpose
torch.nn.BatchNorm2d.train.to.eval
torch.utils.cpp_extension.load.upfirdn2d
self.mapping_net.style_dim.torch.randn.self.mapping_net.mean
pathlib.Path.endswith
tqdm.tqdm
snet.conv1.load_state_dict
FusedLeakyReLUFunction.apply
MultichannelIamge
torch.nn.functional.leaky_relu
model
torch.uint8.img.to.numpy.to
int
pred.squeeze.squeeze.cpu.numpy.squeeze
torch.nn.functional.adaptive_avg_pool2d
mapping_net_ckpt.MappingNetwork.eval
core.model_zoo.model_zoo
self._log_loss
block.conv1.load_state_dict
block
self.InceptionV3.super.__init__
core.utils.load_weights
coremltools.TensorType
self.loss.loss_d
img.view.view
i.snet.layers.load_state_dict
self.branch7x7_2
ConvLayer
fake_pred.F.softplus.mean
self.layers.append
ScaledLeakyReLU
target.load_state_dict
os.path.exists
grad_x.size.grad_x.view.norm.mean
torch.nn.functional.avg_pool2d
calculate_frechet_distance
numpy.atleast_2d.dot
self.student.apply
block_idx.InceptionV3.to
pytorch_lightning.Trainer.fit
core.utils.select_weights
FIDInceptionA
pred.squeeze.squeeze.cpu
numpy.abs
self.upsample
self.activate
_SFB2D
self.to_img
self.up
self.loss.reg_d.items
pytorch_lightning.Trainer
style.self.modulation.view.size
torch.load.items
self.branch3x3_2b
torch.autograd.grad.view
torch.stack
self.FIDInceptionA.super.__init__
torchvision.models.inception_v3
self.to_rgb1
outp.append
numpy.trace
out_dim.torch.zeros.fill_
synthesis_net_ckpt.SynthesisNetwork.eval
isinstance
numpy.mean
torch.cat.size
grad_output.reshape.reshape
numpy.atleast_1d
w.self.style_inv.self.scale.pow
addict.Dict
piq.KID
self.minibatch_discrimination
weight.pow.sum
make_kernel
m.wsize
numpy.atleast_2d
input.size
style_b.unsqueeze.repeat
gt.self.inception.view
fused.fused_bias_act.sum
self.img_to_dwt
self.synthesis_net
channels.append
real.detach
torch.device
distiller.to.to
torch.utils.cpp_extension.load
Upsample
self.r1_reg
print
arch.models.getattr
numpy.concatenate.append
numpy.max
self.compute_mean_style
ImagePathDataset
torch.nn.functional.linear
style.unsqueeze.repeat
shape.torch.randn.to
numpy.allclose
pywt.Wavelet
self.mapping_net.style_dim.self.wsize.torch.randn.to.to
pytorch_wavelets.DWTForward
modules.StyledConv2d
self.synthesis_net.load_state_dict
norm
self.branch7x7dbl_5
modules.legacy.EqualLinear
self.conv1
IDWTUpsaplme
out.permute.view
sfb1d
core.loss.distiller_loss.DistillerLoss
width.height.batch.image.new_empty.normal_
path.Image.open.convert
list
super
core.utils.load_cfg
collections.OrderedDict
outputs.x.x.torch.stack.mean
numpy.diagonal
dim.grad_input.sum.detach
self.convs
snet.to_rgb1.load_state_dict
torch.nn.functional.l1_loss
os.path.dirname
target.state_dict
self.transforms
super.__init__
math.sqrt
create_config
numpy.cov
pred.squeeze.squeeze
make_style
torch.jit.trace
out.permute.reshape
self.mapping_net.load_state_dict
self.input
self.mapping_net.style_dim.self.cfg.batch_size.torch.randn.to
blocks.append
ModulatedConv2d
self.input.repeat
torchvision.transforms.Compose
PerceptualNetwork
t.mul.add_
self.m
pathlib.Path
torch.sqrt
self.student.wsize
self.conv
main
torchvision.transforms.Resize
self.layers.wsize
self.discriminator_step
self.cfg.mode.split
fused_leaky_relu
kornia.augmentation.RandomErasing
self.branch_pool
out.append
m
k.startswith
input.reshape.view
weight.transpose.reshape
mode_to_int
torch.nn.Linear
self.branch7x7_1
diffaug.get_default_transforms
NoiseInjection
FusedLeakyReLUFunctionBackward.apply
modulated_conv2d.ModulatedConv2d
k.replace
core.distiller.Distiller.simultaneous_forward
torch.cuda.is_available
img.view
self._resize
self.final_conv
self.weight_permute.self.weight_dw.transpose.unsqueeze
self.get_demodulation.size
torch.flip
gt.self._resize.detach
modules.legacy.PixelNorm
self.dwt_to_img
self.net.items
self.FIDInceptionE_2.super.__init__
torch.nn.functional.pad
self.kid.compute_metric
torch.nn.MaxPool2d
os.path.join
img_s.cpu
self.idwt
argparse.ArgumentParser.add_argument
coremltools.convert.save
open
modules.ConstantInput
SFB2D.apply
self.gan_loss.loss_g
weight.transpose.reshape.view
self.inception
numpy.hstack
input.new_empty
coremltools.convert
grad_output.new_empty
max
pred.squeeze.squeeze.cpu.numpy
argparse.ArgumentParser
torch.randn
kornia.augmentation.RandomAffine
self.conv2
torch.nn.MSELoss
len
style.view.view
self.student.parameters
extract_mnet
self.l2_loss
i.noise.size
kornia.augmentation.RandomHorizontalFlip
core.models.inception_v3.load_inception_v3
core.dataset.NoiseDataset
self.to_rgb
utils.NoiseManager
ConvLayer.append
self.blur
i.blocks.state_dict
self.branch7x7dbl_3
in_dim.out_dim.torch.randn.div_
self.loss.gan_loss.parameters
t.clamp_
pytorch_fid.inception.InceptionV3
PIL.Image.open
core.loss.non_saturating_gan_loss.NonSaturatingGANLoss
random.random
fake.detach
numpy.eye
noise
argparse.ArgumentParser.parse_args
input.view.view
self.conv1.size
core.models.synthesis_network.SynthesisBlock
core.distiller.Distiller
distiller.mapping_net.style_dim.args.batch_size.torch.randn.to
batch.self.mapping_net.unsqueeze
w.self.style_inv.self.scale.pow.sum
torch.nn.functional.softplus
calculate_fid_given_paths
InceptionV3
self.loss.loss_g
torch.mean
self.gan_loss.loss_d
modules.MobileSynthesisBlock
calculate_activation_statistics
min
numpy.concatenate
compute_statistics_of_path
self.act
self.skip
torch.nn.BatchNorm2d.train
layers.append
self.gan_loss.reg_d
snet.input.load_state_dict
pred.detach
get_activations
os.getcwd
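
To illustrate why pytorch-lightning, gdown and kornia are the version-sensitive dependencies, here is a minimal, hypothetical sketch of the kinds of call sites listed above. The URL, file paths and the commented-out model are placeholders and not code taken from this repository:

```python
# Hypothetical sketch of the version-sensitive call sites listed above.
# The URL, paths and the commented-out model are placeholders, not repository code.
import gdown
import kornia
import pytorch_lightning as pl
import torch

# gdown.cached_download: fetch a checkpoint once and cache it locally
# (the URL below is a placeholder and would need to point at a real file).
ckpt_path = gdown.cached_download(
    "https://example.com/checkpoint.ckpt",
    path="checkpoints/checkpoint.ckpt",
)

# kornia.augmentation.*: differentiable augmentations applied during training.
augment = torch.nn.Sequential(
    kornia.augmentation.RandomHorizontalFlip(),
    kornia.augmentation.RandomAffine(degrees=0, translate=(0.1, 0.1)),
    kornia.augmentation.RandomErasing(),
)

# pytorch_lightning: checkpoint callback plus Trainer / Trainer.fit.
checkpoint_cb = pl.callbacks.ModelCheckpoint()
trainer = pl.Trainer(max_epochs=1, callbacks=[checkpoint_cb])
# trainer.fit(model, train_dataloader)  # model would be a LightningModule

# If any of these signatures change in a newer release, the current pinned or
# unbounded constraints are what would surface the breakage.
```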

@developer
Could you please help me check this issue?
May I open a pull request to fix it?
Thank you very much.
