Make pyproject.toml a selectable option when creating environments #22010

Closed
ackalker opened this issue Sep 16, 2023 · 2 comments
Labels
area-environments Features relating to handling interpreter environments feature-request Request for new features or functionality needs PR Ready to be worked on

Comments


ackalker commented Sep 16, 2023

Type: Bug

Behaviour

Expected vs. Actual

Expected: Leaving all dependency sources (requirements.txt, pyproject.toml) unchecked should not build and install any dependencies from these files.
Actual: Dependencies in pyproject.toml are built and installed during the creation of the virtual environment.

Steps to reproduce:

  1. Clone the repository https://github.com/abetlen/llama-cpp-python (with submodules!) and open it in Visual Studio Code.
  2. Open the Command Palette and run Python: Create environment
  3. Choose to create a .venv type environment.
  4. In the dialogs that follow, make sure that none of the requirements.txt or pyproject.toml files are selected for installing dependencies, e.g. hit Enter while keyboard focus is in the topmost edit box, without any of the checkboxes checked.

Note that I had to run these steps twice to even get a working Python virtual environment: installing the dependencies from pyproject.toml forced me to install some build tools that I didn't have at the time (I don't need them for what I want to do, which is edit and lint Python code) before I could retry creating the virtual environment.
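
For clarity, what I expected with nothing selected is essentially just the bare environment creation that the log below starts with, without the later pip install -e . step. A minimal sketch in Python of that expectation (illustrative only, not the extension's create_venv.py; the .venv path is just an example):

    import venv

    # Create the virtual environment and bootstrap pip, but install nothing
    # from requirements.txt or pyproject.toml.
    venv.EnvBuilder(with_pip=True, clear=False).create(".venv")

    # No `pip install -e .` here -- that step is exactly what should be
    # skipped when no dependency source is checked.

In other words, creating the environment and installing project dependencies should be separate, opt-in steps.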

Diagnostic data

  • Python version (& distribution if applicable, e.g. Anaconda): 3.10.12
  • Type of virtual environment used (e.g. conda, venv, virtualenv, etc.): Venv
  • Value of the python.languageServer setting: Default
Output for Python in the Output panel (View → Output, change the drop-down in the upper-right of the Output panel to Python)

2023-09-16 15:56:39.738 [info] Experiment 'pythonPromptNewFormatterExt' is active
2023-09-16 15:56:39.738 [info] Experiment 'pythonPromptNewToolsExt' is active
2023-09-16 15:56:39.738 [info] Experiment 'pythonTerminalEnvVarActivation' is active
2023-09-16 15:56:39.738 [info] Experiment 'pythonTestAdapter' is active
2023-09-16 15:56:39.738 [info] Test server listening.
2023-09-16 15:56:39.738 [info] > conda info --json
2023-09-16 15:56:39.739 [info] Found: /bin/python3 --> /bin/python3.10
2023-09-16 15:56:39.739 [info] Found: /bin/python3.10 --> /bin/python3.10
2023-09-16 15:56:39.745 [info] Found: /usr/bin/python3 --> /usr/bin/python3.10
2023-09-16 15:56:39.745 [info] Found: /usr/bin/python3.10 --> /usr/bin/python3.10
2023-09-16 15:56:40.940 [info] > /bin/python3 -I ~/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/get_output_via_markers.py ~/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/interpreterInfo.py
2023-09-16 15:56:41.045 [info] > /usr/bin/python3 -I ~/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/get_output_via_markers.py ~/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/interpreterInfo.py
2023-09-16 15:56:41.118 [info] Python interpreter path: /bin/python3
2023-09-16 15:56:43.168 [info] Starting Pylance language server.
2023-09-16 15:58:57.288 [info] Selected workspace /mnt/d/git/llama-cpp-python for creating virtual environment.
2023-09-16 15:59:14.887 [info] Selected interpreter /bin/python3 for creating virtual environment.
2023-09-16 15:59:49.677 [info] Running Env creation script:  [
  '/bin/python3',
  '/home/miki/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/create_venv.py',
  '--git-ignore',
  '--toml',
  '/mnt/d/git/llama-cpp-python/pyproject.toml'
]
2023-09-16 15:59:49.718 [info] Running: /bin/python3 -m venv --without-pip .venv

2023-09-16 15:59:50.284 [info] CREATED_VENV:/mnt/d/git/llama-cpp-python/.venv/bin/python

2023-09-16 15:59:50.284 [info] Creating: /mnt/d/git/llama-cpp-python/.venv/.gitignore

2023-09-16 15:59:50.318 [info] CREATE_VENV.DOWNLOADING_PIP

2023-09-16 15:59:50.643 [info] CREATE_VENV.INSTALLING_PIP
Running: /mnt/d/git/llama-cpp-python/.venv/bin/python /mnt/d/git/llama-cpp-python/.venv/pip.pyz install pip

2023-09-16 15:59:54.912 [info] Collecting pip

2023-09-16 15:59:54.912 [info]   Obtaining dependency information for pip from https://files.pythonhosted.org/packages/50/c2/e06851e8cc28dcad7c155f4753da8833ac06a5c704c109313b8d5a62968a/pip-23.2.1-py3-none-any.whl.metadata

2023-09-16 15:59:55.008 [info]   Downloading pip-23.2.1-py3-none-any.whl.metadata (4.2 kB)

2023-09-16 15:59:55.024 [info] Downloading pip-23.2.1-py3-none-any.whl (2.1 MB)

2023-09-16 15:59:55.134 [info]    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.1/2.1 MB 20.3 MB/s eta 0:00:00
2023-09-16 15:59:55.134 [info] 

2023-09-16 15:59:55.145 [info] Installing collected packages: pip

2023-09-16 16:00:51.997 [info] Successfully installed pip-23.2.1

2023-09-16 16:00:52.527 [info] VENV_INSTALLING_PYPROJECT: /mnt/d/git/llama-cpp-python/pyproject.toml
Running: /mnt/d/git/llama-cpp-python/.venv/bin/python -m pip install -e .

2023-09-16 16:01:04.076 [info] Obtaining file:///mnt/d/git/llama-cpp-python

2023-09-16 16:01:04.107 [info]   Installing build dependencies: started

2023-09-16 16:01:20.585 [info]   Installing build dependencies: finished with status 'done'

2023-09-16 16:01:20.586 [info]   Checking if build backend supports build_editable: started

2023-09-16 16:01:20.965 [info]   Checking if build backend supports build_editable: finished with status 'done'

2023-09-16 16:01:20.967 [info]   Getting requirements to build editable: started

2023-09-16 16:01:22.228 [info]   Getting requirements to build editable: finished with status 'done'

2023-09-16 16:01:22.264 [info]   Installing backend dependencies: started

2023-09-16 16:01:37.826 [info]   Installing backend dependencies: finished with status 'done'

2023-09-16 16:01:37.827 [info]   Preparing editable metadata (pyproject.toml): started

2023-09-16 16:01:38.908 [info]   Preparing editable metadata (pyproject.toml): finished with status 'done'

2023-09-16 16:01:39.497 [info] Collecting typing-extensions>=4.5.0 (from llama_cpp_python==0.2.4)

2023-09-16 16:01:39.497 [info]   Obtaining dependency information for typing-extensions>=4.5.0 from https://files.pythonhosted.org/packages/ec/6b/63cc3df74987c36fe26157ee12e09e8f9db4de771e0f3404263117e75b95/typing_extensions-4.7.1-py3-none-any.whl.metadata

2023-09-16 16:01:39.604 [info]   Downloading typing_extensions-4.7.1-py3-none-any.whl.metadata (3.1 kB)

2023-09-16 16:01:40.070 [info] Collecting numpy>=1.20.0 (from llama_cpp_python==0.2.4)

2023-09-16 16:01:40.070 [info]   Obtaining dependency information for numpy>=1.20.0 from https://files.pythonhosted.org/packages/71/3c/3b1981c6a1986adc9ee7db760c0c34ea5b14ac3da9ecfcf1ea2a4ec6c398/numpy-1.25.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata

2023-09-16 16:01:40.100 [info]   Downloading numpy-1.25.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (5.6 kB)

2023-09-16 16:01:40.399 [info] Collecting diskcache>=5.6.1 (from llama_cpp_python==0.2.4)

2023-09-16 16:01:40.399 [info]   Obtaining dependency information for diskcache>=5.6.1 from https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl.metadata

2023-09-16 16:01:40.442 [info]   Downloading diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)

2023-09-16 16:01:40.483 [info] Downloading diskcache-5.6.3-py3-none-any.whl (45 kB)

2023-09-16 16:01:40.497 [info]    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 45.5/45.5 kB 4.4 MB/s eta 0:00:00
2023-09-16 16:01:40.497 [info] 

2023-09-16 16:01:40.528 [info] Downloading numpy-1.25.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB)

2023-09-16 16:01:40.930 [info]    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 18.2/18.2 MB 67.9 MB/s eta 0:00:00
2023-09-16 16:01:40.930 [info] 

2023-09-16 16:01:40.965 [info] Downloading typing_extensions-4.7.1-py3-none-any.whl (33 kB)

2023-09-16 16:01:40.993 [info] Building wheels for collected packages: llama_cpp_python

2023-09-16 16:01:40.995 [info]   Building editable for llama_cpp_python (pyproject.toml): started

2023-09-16 16:01:47.571 [info]   Building editable for llama_cpp_python (pyproject.toml): finished with status 'error'

2023-09-16 16:01:47.576 [info]   error: subprocess-exited-with-error
  
  × Building editable for llama_cpp_python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [26 lines of output]
      *** scikit-build-core 0.5.0 using CMake 3.27.4 (editable)
      *** Configuring CMake...
      2023-09-16 16:01:42,035 - scikit_build_core - WARNING - libdir/ldlibrary: /usr/lib/x86_64-linux-gnu/libpython3.10.so is not a real file!
      2023-09-16 16:01:42,036 - scikit_build_core - WARNING - Can't find a Python library, got libdir=/usr/lib/x86_64-linux-gnu, ldlibrary=libpython3.10.so, multiarch=x86_64-linux-gnu, masd=x86_64-linux-gnu
      loading initial cache file /tmp/tmpiik4z0u5/build/CMakeInit.txt
      -- The C compiler identification is unknown
      -- The CXX compiler identification is unknown
      CMake Error at CMakeLists.txt:3 (project):
        No CMAKE_C_COMPILER could be found.
      
        Tell CMake where to find the compiler by setting either the environment
        variable "CC" or the CMake cache entry CMAKE_C_COMPILER to the full path to
        the compiler, or to the compiler name if it is in the PATH.
      
      
      CMake Error at CMakeLists.txt:3 (project):
        No CMAKE_CXX_COMPILER could be found.
      
        Tell CMake where to find the compiler by setting either the environment
        variable "CXX" or the CMake cache entry CMAKE_CXX_COMPILER to the full path
        to the compiler, or to the compiler name if it is in the PATH.
      
      
      -- Configuring incomplete, errors occurred!
      
      *** CMake configuration failed
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.

2023-09-16 16:01:47.576 [info]   ERROR: Failed building editable for llama_cpp_python

2023-09-16 16:01:47.576 [info] Failed to build llama_cpp_python

2023-09-16 16:01:47.577 [info] ERROR: Could not build wheels for llama_cpp_python, which is required to install pyproject.toml-based projects

2023-09-16 16:01:47.812 [info] Traceback (most recent call last):
  File "/home/miki/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/create_venv.py", line 84, in run_process

2023-09-16 16:01:47.812 [info]     subprocess.run(args, cwd=os.getcwd(), check=True)
  File "/usr/lib/python3.10/subprocess.py", line 526, in run

2023-09-16 16:01:47.813 [info]     raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['/mnt/d/git/llama-cpp-python/.venv/bin/python', '-m', 'pip', 'install', '-e', '.']' returned non-zero exit status 1.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/miki/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/create_venv.py", line 250, in <module>

2023-09-16 16:01:47.814 [info]     main(sys.argv[1:])
  File "/home/miki/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/create_venv.py", line 241, in main
    install_toml(venv_path, args.extras)
  File "/home/miki/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/create_venv.py", line 113, in install_toml
    run_process(
  File "/home/miki/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/create_venv.py", line 86, in run_process
    raise VenvError(error_message)
__main__.VenvError: CREATE_VENV.PIP_FAILED_INSTALL_PYPROJECT

2023-09-16 16:01:47.824 [error] Error while running venv creation script:  CREATE_VENV.PIP_FAILED_INSTALL_PYPROJECT
2023-09-16 16:01:47.824 [error] CREATE_VENV.PIP_FAILED_INSTALL_PYPROJECT
2023-09-16 16:11:09.598 [info] Selected workspace /mnt/d/git/llama-cpp-python for creating virtual environment.
2023-09-16 16:11:19.758 [info] Selected interpreter /bin/python3 for creating virtual environment.
2023-09-16 16:12:01.514 [info] Deleted venv dir: /mnt/d/git/llama-cpp-python/.venv
2023-09-16 16:12:01.523 [info] Running Env creation script:  [
  '/bin/python3',
  '/home/miki/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/create_venv.py',
  '--git-ignore',
  '--toml',
  '/mnt/d/git/llama-cpp-python/pyproject.toml'
]
2023-09-16 16:12:01.584 [info] Running: /bin/python3 -m venv --without-pip .venv

2023-09-16 16:12:01.783 [info] CREATED_VENV:/mnt/d/git/llama-cpp-python/.venv/bin/python

2023-09-16 16:12:01.783 [info] Creating: /mnt/d/git/llama-cpp-python/.venv/.gitignore

2023-09-16 16:12:01.908 [info] CREATE_VENV.DOWNLOADING_PIP

2023-09-16 16:12:02.011 [info] CREATE_VENV.INSTALLING_PIP
Running: /mnt/d/git/llama-cpp-python/.venv/bin/python /mnt/d/git/llama-cpp-python/.venv/pip.pyz install pip

2023-09-16 16:12:06.020 [info] Collecting pip

2023-09-16 16:12:06.020 [info]   Obtaining dependency information for pip from https://files.pythonhosted.org/packages/50/c2/e06851e8cc28dcad7c155f4753da8833ac06a5c704c109313b8d5a62968a/pip-23.2.1-py3-none-any.whl.metadata

2023-09-16 16:12:06.021 [info]   Using cached pip-23.2.1-py3-none-any.whl.metadata (4.2 kB)

2023-09-16 16:12:06.029 [info] Using cached pip-23.2.1-py3-none-any.whl (2.1 MB)

2023-09-16 16:12:06.039 [info] Installing collected packages: pip

2023-09-16 16:13:02.609 [info] Successfully installed pip-23.2.1

2023-09-16 16:13:02.822 [info] VENV_INSTALLING_PYPROJECT: /mnt/d/git/llama-cpp-python/pyproject.toml
Running: /mnt/d/git/llama-cpp-python/.venv/bin/python -m pip install -e .

2023-09-16 16:13:14.394 [info] Obtaining file:///mnt/d/git/llama-cpp-python

2023-09-16 16:13:14.433 [info]   Installing build dependencies: started

2023-09-16 16:13:30.261 [info]   Installing build dependencies: finished with status 'done'

2023-09-16 16:13:30.262 [info]   Checking if build backend supports build_editable: started

2023-09-16 16:13:30.610 [info]   Checking if build backend supports build_editable: finished with status 'done'

2023-09-16 16:13:30.611 [info]   Getting requirements to build editable: started

2023-09-16 16:13:31.879 [info]   Getting requirements to build editable: finished with status 'done'

2023-09-16 16:13:31.911 [info]   Installing backend dependencies: started

2023-09-16 16:13:46.620 [info]   Installing backend dependencies: finished with status 'done'

2023-09-16 16:13:46.620 [info]   Preparing editable metadata (pyproject.toml): started

2023-09-16 16:13:47.648 [info]   Preparing editable metadata (pyproject.toml): finished with status 'done'

2023-09-16 16:13:48.220 [info] Collecting typing-extensions>=4.5.0 (from llama_cpp_python==0.2.4)

2023-09-16 16:13:48.220 [info]   Obtaining dependency information for typing-extensions>=4.5.0 from https://files.pythonhosted.org/packages/ec/6b/63cc3df74987c36fe26157ee12e09e8f9db4de771e0f3404263117e75b95/typing_extensions-4.7.1-py3-none-any.whl.metadata

2023-09-16 16:13:48.222 [info]   Using cached typing_extensions-4.7.1-py3-none-any.whl.metadata (3.1 kB)

2023-09-16 16:13:48.633 [info] Collecting numpy>=1.20.0 (from llama_cpp_python==0.2.4)

2023-09-16 16:13:48.634 [info]   Obtaining dependency information for numpy>=1.20.0 from https://files.pythonhosted.org/packages/71/3c/3b1981c6a1986adc9ee7db760c0c34ea5b14ac3da9ecfcf1ea2a4ec6c398/numpy-1.25.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata

2023-09-16 16:13:48.635 [info]   Using cached numpy-1.25.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (5.6 kB)

2023-09-16 16:13:48.901 [info] Collecting diskcache>=5.6.1 (from llama_cpp_python==0.2.4)

2023-09-16 16:13:48.902 [info]   Obtaining dependency information for diskcache>=5.6.1 from https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl.metadata

2023-09-16 16:13:48.903 [info]   Using cached diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)

2023-09-16 16:13:48.908 [info] Using cached diskcache-5.6.3-py3-none-any.whl (45 kB)

2023-09-16 16:13:48.943 [info] Using cached numpy-1.25.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB)

2023-09-16 16:13:48.963 [info] Using cached typing_extensions-4.7.1-py3-none-any.whl (33 kB)

2023-09-16 16:13:48.984 [info] Building wheels for collected packages: llama_cpp_python

2023-09-16 16:13:48.985 [info]   Building editable for llama_cpp_python (pyproject.toml): started

2023-09-16 16:14:02.724 [info]   Building editable for llama_cpp_python (pyproject.toml): finished with status 'done'

2023-09-16 16:14:02.726 [info]   Created wheel for llama_cpp_python: filename=llama_cpp_python-0.2.4-cp310-cp310-manylinux_2_35_x86_64.whl size=921377 sha256=a18eb9ea296782fa2ff07b765d218acda0f1450c307a6622dc06da244376f058

2023-09-16 16:14:02.726 [info]   Stored in directory: /tmp/pip-ephem-wheel-cache-_u077z2v/wheels/c6/c9/22/31bac3ad9c62a2b3c4db79b3a4b417d3614d6dce3cbd3d47df

2023-09-16 16:14:02.728 [info] Successfully built llama_cpp_python

2023-09-16 16:14:02.759 [info] Installing collected packages: typing-extensions, numpy, diskcache, llama_cpp_python

2023-09-16 16:15:16.018 [info] Successfully installed diskcache-5.6.3 llama_cpp_python-0.2.4 numpy-1.25.2 typing-extensions-4.7.1

2023-09-16 16:15:16.301 [info] CREATE_VENV.PIP_INSTALLED_PYPROJECT

2023-09-16 16:15:16.481 [info] > ./.venv/bin/python -I ~/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/get_output_via_markers.py ~/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/interpreterInfo.py
2023-09-16 16:15:16.577 [info] Discover tests for workspace name: llama-cpp-python - uri: /mnt/d/git/llama-cpp-python
2023-09-16 16:15:16.578 [info] Python interpreter path: ./.venv/bin/python
2023-09-16 16:15:16.610 [info] > . ./.venv/bin/activate && echo 'e8b39361-0157-4923-80e1-22d70d46dee6' && python ~/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/printEnvVariables.py
2023-09-16 16:15:16.610 [info] shell: bash
2023-09-16 16:15:16.750 [info] > /bin/python3 ~/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/printEnvVariables.py
2023-09-16 16:15:16.750 [info] shell: bash
2023-09-16 16:16:31.825 [info] Running installed packages checker:  /mnt/d/git/llama-cpp-python/.venv/bin/python /home/miki/.vscode-server-insiders/extensions/ms-python.python-2023.17.12582041/pythonFiles/installed_check.py /mnt/d/git/llama-cpp-python/pyproject.toml

User Settings


languageServer: "Pylance"

Extension version: 2023.17.12582041
VS Code version: Code - Insiders 1.83.0-insider (bccfade64adb249f57c8fcf03cba41609f76ce5c, 2023-09-15T05:35:16.508Z)
OS version: Windows_NT x64 10.0.22621
Modes:
Remote OS version: Linux x64 5.15.90.1-microsoft-standard-WSL2

System Info
Item Value
CPUs 12th Gen Intel(R) Core(TM) i5-12400F (12 x 2496)
GPU Status 2d_canvas: enabled
canvas_oop_rasterization: enabled_on
direct_rendering_display_compositor: disabled_off_ok
gpu_compositing: enabled
multiple_raster_threads: enabled_on
opengl: enabled_on
rasterization: enabled
raw_draw: disabled_off_ok
video_decode: enabled
video_encode: enabled
vulkan: disabled_off
webgl: enabled
webgl2: enabled
webgpu: enabled
Load (avg) undefined
Memory (System) 15.79GB (6.80GB free)
Process Argv --crash-reporter-id 78c6d751-e4b0-4959-a782-10106e5f6b9d
Screen Reader yes
VM 0%
Item Value
Remote WSL: Ubuntu
OS Linux x64 5.15.90.1-microsoft-standard-WSL2
CPUs 12th Gen Intel(R) Core(TM) i5-12400F (12 x 2496)
Memory (System) 7.65GB (6.42GB free)
VM 0%
A/B Experiments
vsliv695:30137379
vsins829:30139715
vsliv368:30146709
vsreu685:30147344
python383cf:30185419
vspor879:30202332
vspor708:30202333
vspor363:30204092
vswsl492:30256197
vslsvsres303:30308271
pythontb:30258533
pythonptprofiler:30281269
vsdfh931cf:30280410
vshan820:30294714
vscod805:30301674
bridge0708:30335490
bridge0723:30353136
vsaa593cf:30376535
pythonvs932:30404738
py29gd2263:30784851
vsclangdf:30492506
c4g48928:30535728
dsvsc012cf:30540253
pynewext54:30618038
a9j8j154:30646983
showlangstatbar:30737417
57b77579:30687741
pythonfmttext:30716741
fixshowwlkth:30771523
showindicator:30805243
pythongtdpath:30726887
i26e3531:30792625
welcomedialog:30812478
pythonnosmt12:30779711
pythonidxpt:30768918
pythonnoceb:30776497
copilotsettingt:30808721
asynctok:30821568
dsvsc013:30777762
dsvsc014:30777825
diffeditorv2:30786206
pythonlinttype:30823781
pythonmpsinfo:30815194
dsvsc015:30821418
pythontestfixt:30826906
pythonfb280951:30830809

@github-actions github-actions bot added the triage-needed Needs assignment to the proper sub-team label Sep 16, 2023
@karthiknadig karthiknadig self-assigned this Sep 18, 2023
@karthiknadig
Member

Currently, the Create Environment command installs from pyproject.toml if it has a build-system table in it. The intent with this command is that it tries to follow the recommendations of the project. The choice you currently get with TOML appears only when the file has an optional-dependencies table.

We discussed this with the team and we plan on making TOML a selectable option. Note that when TOML is selected we do an editable install of the project, so it will build and install any dependencies that the project requires. If you want to create an environment and set it up in a particular way, you will have to create it manually.
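
For illustration only (this is not the extension's actual code; it assumes Python 3.11+ for tomllib and the standard PEP 621 table names), the check described above amounts to something like:

    import tomllib

    def should_offer_toml_install(pyproject_path: str) -> tuple[bool, list[str]]:
        # A build-system table makes the project installable via pip, and
        # project.optional-dependencies is what currently produces a choice
        # (extras) in the Create Environment flow.
        with open(pyproject_path, "rb") as f:
            data = tomllib.load(f)
        installable = "build-system" in data
        extras = sorted(data.get("project", {}).get("optional-dependencies", {}))
        return installable, extras

When the project is installable, the install itself is the editable pip install -e . that appears in the log above.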

@github-actions github-actions bot added the info-needed Issue requires more information from poster label Sep 19, 2023
@karthiknadig karthiknadig added feature-request Request for new features or functionality area-environments Features relating to handling interpreter environments needs PR Ready to be worked on and removed info-needed Issue requires more information from poster triage-needed Needs assignment to the proper sub-team labels Sep 19, 2023
@karthiknadig karthiknadig removed their assignment Sep 19, 2023
@karthiknadig karthiknadig changed the title Python: Create environment command tries to install dependencies from pyproject.toml even when that file is not selected for installing dependencies Make pyproject.toml a selectable option when creating environments Sep 19, 2023
@eleanorjboyd
Member

I think this works now, as I have tried something similar. Let me know if you are still seeing this and I can re-open. Thanks!
