add support for artifact dependencies (#9096)

Tracking issue: https://github.com/rust-lang/cargo/issues/9096
Original PR: https://github.com/rust-lang/cargo/pull/9992

Add 'bindeps' -Z flag for later use

A test to validate artifact dependencies aren't currently parsed.

Parse 'artifact' and 'lib' fields.

Note that this isn't behind a feature toggle so 'unused' messages will
disappear.

Transfer artifact dependencies from toml- into manifest-dependencies

There are a few premises governing the operation.

- if unstable features are not set, warn when 'artifact' or 'lib' is
  encountered.
- bail if 'lib' is encountered alone, but warn that this WOULD happen
  with nightly.
- artifact parsing checks for all invariants, but some aren't tested.
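The syntax being parsed above can be sketched as a manifest fragment; the dependency names below are made up, but the `artifact`, `lib`, and `target` keys follow RFC-3028 (and the feature is gated behind the unstable `-Z bindeps` flag):

```toml
[dependencies]
# Depend on the *binary* that `some-tool` builds; `lib = true` additionally
# keeps the conventional library dependency around.
some-tool = { version = "1.0", artifact = "bin", lib = true }

[build-dependencies]
# Build the artifact for an explicit target triple instead of the host.
cross-helper = { version = "0.1", artifact = "bin", target = "x86_64-unknown-linux-gnu" }
```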

Assure serialization of 'artifact' and 'lib' fields produces suitable values during publishing

This should be the only place where these fields matter and where a cargo
manifest is actually produced. These are only for internal use, no user
is typically going to see or edit them.

Place all artifact dependency tests into their own module

This facilitates deduplication later and possibly redistribution into
other modules if there is a better fit.

Represent artifacts that are rust libraries as another ArtifactKind

This is more consistent and probably simpler for later use.
No need to reflect the TOML data structure.

Add tests to assure only 'lib = true' artifact deps are documented

RFC-3028 doesn't talk about documentation, but for lib=true it's clear
what the desired behaviour should be.
If an artifact isn't a library though, then for now, it's transparent,
maybe.

Many more tests, more documentation, mild `Artifact` refactor

The latter seems to be a better fit for what being an artifact
really means within cargo: it literally turns being a library
on or off, so a crate only optionally becomes a normal library.

refactor to prepare for artifact related checks

Don't show a no-lib warning for artifact dependencies (with lib = false)

Tests for more artifact dependency invariants

These are merely a proof of concept to show that we are not in
a position to actually figure out everything about artifacts
right after resolution.

However, the error message looks more like a fatal error and less
like something that can happen with a more elaborate error message
with causes.

This might show that these kinds of checks might be better done later,
right before trying to use the information to create compile units.

Validate that artifact deps with lib=true still trigger no-lib warnings

This triggers the same warning as before, for now without any
customization to indicate it's an artifact dependency.

Use warnings instead of errors
------------------------------

This avoids the kind of harsh end of compilation in favor of something
that can be recovered from. Since warnings are annoying, users will
probably avoid re-declaring artifact dependencies.

Hook in artifact dependencies into build script runs

Even though we would still have to see what happens if they have a lib
as well. Is it built twice?
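As a sketch of what "hooking in" means for a consumer: a dependent's build script can locate the built artifact through environment variables of the form `CARGO_BIN_FILE_<DEP_NAME>`. The dependency name `some-dep` and the helper below are hypothetical; only the variable-name scheme comes from this PR:

```rust
use std::env;
use std::path::PathBuf;

/// Look up the path cargo exposes for a binary artifact dependency,
/// e.g. `CARGO_BIN_FILE_SOME_DEP` for a dependency named `some-dep`.
fn artifact_bin_file(dep_name: &str) -> Option<PathBuf> {
    let var = format!(
        "CARGO_BIN_FILE_{}",
        dep_name.to_uppercase().replace('-', "_")
    );
    env::var_os(&var).map(PathBuf::from)
}

fn main() {
    // In a real build script, cargo sets this for `artifact = "bin"` deps.
    match artifact_bin_file("some-dep") {
        Some(path) => println!("artifact binary at {}", path.display()),
        None => println!("no artifact dependency named `some-dep`"),
    }
}
```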

Also
----

- fly-by refactor: fix typo; use ? in method returning option
- Propagate artifact information into Units; put artifacts into place

  This means artifacts now have their own place in the 'artifact'
  directory and uplifts won't happen for them.

- refactor and fix clippy suggestion
- fix build after rebasing onto master

Create directories when executing the job, and not when preparing it.

also: Get CI to work on windows the easy way, for now.

Set directories for artifact dependencies in build script runtimes

Test remaining kinds of build-script runtime environment variables

Also
----
- Fix windows tests, the quick way.
- Try to fix windows assertions, and generalize them
- Fix second test for windows, hopefully

test for available library dependency in build scripts with lib = true

We should probably generally exclude all artifact dependencies with lib=false.

Pass renamed dep names along with unit deps to allow proper artifact env names

Test for selective bin:<name> syntax, as well as binaries with dashes

Test to assure dependency names are transformed correctly
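The transformation under test is, in essence, uppercasing the (possibly renamed) dependency name and replacing dashes with underscores before it becomes part of the environment variable name. A minimal sketch (the helper name is mine, not cargo's):

```rust
/// Derive the env var name for an artifact's file path, mirroring the
/// `to_uppercase().replace("-", "_")` mangling used in this PR.
fn artifact_file_var(artifact_type: &str, dep_name: &str) -> String {
    format!(
        "CARGO_{}_FILE_{}",
        artifact_type.to_uppercase(),
        dep_name.to_uppercase().replace('-', "_")
    )
}

fn main() {
    // A renamed dependency `my-helper` of kind `bin`:
    println!("{}", artifact_file_var("bin", "my-helper")); // CARGO_BIN_FILE_MY_HELPER
}
```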

assure advertised binaries and directories are actually present

This wouldn't be the case if dependencies are not setup correctly,
for instance.

Also
----
 - make it easier to see actual values even on failure

   This should help figure out why on CI something fails that works
   locally no matter what.
   Turns out this is a race condition, with my machine being on the good
   side of it so it doesn't show in testing. Fortunately it still can be
   reproduced and easily tested for.

 - refactor test; the race condition is still present though

 - Force CI to pass here by avoiding checks triggering race.

 - Fix windows build, maybe?

More tolerant is_file() checks to account for delay on CI

This _should_ help CI to test for the file's presence, which is better
than not testing at all.

This appears to be needed as the output file isn't ready/present in time
for some reason.

The root cause of this issue is unknown, but it's definitely a race
as it rarely happens locally. When it happened, the file was always
present after the run.
Now we will learn if it is truly not present, ever, or if it's
something else entirely.
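The "more tolerant" check can be sketched as a bounded retry loop instead of a single `is_file()` call; the timeout and function name here are arbitrary choices of mine:

```rust
use std::path::Path;
use std::time::{Duration, Instant};

/// Poll for a file to appear, tolerating short delays (e.g. on CI
/// filesystems), rather than failing on a single `is_file()` probe.
fn is_file_eventually(path: &Path, timeout: Duration) -> bool {
    let start = Instant::now();
    loop {
        if path.is_file() {
            return true;
        }
        if start.elapsed() >= timeout {
            return false;
        }
        std::thread::sleep(Duration::from_millis(10));
    }
}

fn main() {
    let present = is_file_eventually(Path::new("Cargo.toml"), Duration::from_millis(50));
    println!("Cargo.toml present: {}", present);
}
```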

Validate libs also don't see artifact dependencies as libraries with lib=false

Also
----

 - Add preliminary test for validating build-time artifacts
 - Try to fix CI on gnu windows

   Which apparently generates paths similar to linux, but with .exe suffix.
   The current linux patterns should match that.

 - refactor

   Help sharing code across modules

allow rustc to use artifact dep environment variables, but…

…it needs some adjustments to actually setup the unit dependency graph
with artifacts as well.
Right now it will only setup dependencies for artifacts that are libs,
but not the artifacts themselves, completely ignoring them when they
are not libs.

Make artifact dependencies available in main loop

This is the commit message #2:
------------------------------

rough cut of support for artifact dependencies at build time…

…which unfortunately already shows that the binary it is supposed to
include is reproducibly not ready in time even though the path is
correct and it's present right after the run.

Could it be related to rmeta?

This is the commit message #3:
------------------------------

Fix test expectations as failure is now typical, rather than the warning we had before…

…and add some tolerance to existing test to avoid occasional failures.

This doesn't change the issue that it also doesn't work at all for
libraries, which is nicely reproducible and hopefully helps to fix
this issue.

This is the commit message #4:
------------------------------

Probably the fix for the dependency issue in the scheduler

This means that bin() targets are now properly added to the job graph
to cause proper syncing, whereas previously apparently it would
still schedule binaries, but somehow consider them rmeta and thus
start their dependents too early, leading to races.

This is the commit message #5:
------------------------------

Don't accidentally include non-gnu windows tests in gnu windows.

Support cargo doc and cargo check

The major changes here are…

- always compile artifacts in build mode, as we literally want the
  build output, always, which the dependent might rely on being present.
- share code between the rather similar looking paths for rustdoc and
  rustc.

Make artifact messages appear more in line with cargo by using backticks

Also: Add first test for static lib support in build scripts

build-scripts with support for cdylib and staticlib

 - Fix windows msvc build

   No need to speculate why the staticlib has hashes in the name even
   though nothing else.

staticlib and cdylib support for libraries

test staticlib and cdylibs for rustdoc as well.

Also catch a seemingly untested special case/warning about the lack
of linkable items, which probably shouldn't be an issue for artifacts
as they are not linkable in the traditional sense.

more useful test for 'cargo check'

`cargo check` isn't used very consistently in tests, so when we use it
we should be sure to actually try to use an artifact based feature
to gain some coverage.

verify that multiple versions are allowed for artifact deps as well.

also: remove redundant test

This is the commit message #2:
------------------------------

Properly choose which dependencies take part in artifact handling

Previously it would include them very generously without considering
the compatible dependency types.

This is the commit message #3:
------------------------------

a more complex test which includes dev-dependencies

It also shows that doc-tests don't yet work as rustdoc is run outside of
the system into which we integrate right now.

It should be possible to write our environment variable configuration
in terms of this 'finished compilation' though, hopefully with
most code reused.

This is the commit message #4:
------------------------------

A first stab at storing artifact environment variables for packages…

…however, it seems like the key for this isn't necessarily correct
under all circumstances. Maybe it should be something more specific,
don't know.

This is the commit message #5:
------------------------------

Adjust key for identifying units to Metadata

This one is actually unique and feels much better.

This is the commit message #6:
------------------------------

Attempt to make use of artifact environment information…

…but fail as the metadata won't match as the doctest unit is, of course,
its separate unit. Now I wonder if its possible to find the artifact
units in question that have the metadata.

Properly use metadata to use artifact environment variables in doctests

This is the commit message #2:
------------------------------

Add test for resolver = "2" and build dependencies

Interestingly the 'host-features' flag must be set (as is seemingly
documented in the flags documentation as well), even though I am not
quite sure if this is the 100% correct solution. Should it rather
have an entry with this flag being false in its map? Probably not…
but I am not quite certain.

This is the commit message #3:
------------------------------

set most if not all tests to use resolver = "2"

This allows keeping it working with the most recent version while
still being able to quickly test with "1" as well (which thus far was
working fine).

All tests I could imagine (excluding target and profiles) are working now

Cross-platform tests now run on the aarch64 architecture as well.

More stringent negative testing

Fix incorrect handling of dependency directory computation

Previously it would just 'hack' the deps-dir to become something very
different for artifacts.

This could easily be fixed by putting the logic for artifact output
directories into the right spot.

A test for cargo-tree to indicate artifacts aren't handled specifically

Assure build-scripts can't access artifacts at build time

Actual doc-tests with access to artifact env vars

All relevant parsing of `target = [..]`

Next step is to actually take it into consideration.

A failing test for adjusting the target for build script artifacts using --target

Check for unknown artifact target triple in a place that exists for a year

The first test showing that `target="target"` deps seemingly work

For now only tested for build scripts, but it won't be much different
for non-build dependencies.

build scripts accept custom targets unconditionally

Support target setting for non-build dependencies

This is the commit message #2:
------------------------------

Add doc-test cross compile related test

Even though there is no artifact code specific to doc testing, it's
worth trying it with different target settings to validate
it still works, despite doc tests having some special casing around
target settings.

This is the commit message #3:
------------------------------

A test to validate profiles work as expected for build-deps and non-build deps

No change is required to make this work and artifact dependencies 'just work'
based on the typical rules of their non-artifact counterparts.

This is the commit message #4:
------------------------------

Adjust `cargo metadata` to deal with artifact dependencies

This commit was squashed and there is probably more that changed.

This is the commit message #5:
------------------------------

Show bin-only artifacts in "resolve" of metadata as well.

This is the commit message #6:
------------------------------

minor refactoring during research for RFC-3176

This will soon need to return multiple extern-name/dep-name pairs.

This is the commit message #7:
------------------------------

See if opt-level 3 works on win-msvc in basic profile test for artifacts

This is the same value as is used in the other test of the same name,
which certainly runs on windows.

This is the commit message #8:
------------------------------

refactor

Assure the type for targets reflect that they cannot be the host target,
which removes a few unreachable!() expressions.

Put `root_unit_compile_kind` into `UnitFor`

Previously that wasn't done because of the unused `all_values()`
method, which has now been deleted as it's not being used anymore.

This allows for the root unit compile kind to be passed as originally
intended, instead of working around the previous lack of extendability
of UnitFor due to ::all_values().

This is also the basis for better/correct feature handling once
feature resolution can be depending on the artifact target as well,
resulting in another extension to UnitFor for that matter.

Also
----

 - Fix ordering

   Previously the re-created target_mode was used due to the reordering
   in code, and who knows what kind of effects that might have
   (despite the test suite being OK with it).

   Let's put it back in place.

 - Deactivate test with filename collision on MSVC until RFC-3176 lands

Avoid clashes with binaries called 'artifact' by putting 'artifact/' into './deps/'

This commit addresses review comment https://github.com/rust-lang/cargo/pull/9992#discussion_r772939834

Don't rely on operator precedence for boolean operations

Now it should be clear that no matter what the first term is,
if the unit is an artifact, we should enqueue it.
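The shape of the fix (with hypothetical flag names; the real condition lives in the job queue) is just explicit parentheses, since `&&` binds tighter than `||` in Rust and the intent deserves to be visible:

```rust
/// Artifacts are enqueued unconditionally; the parentheses make that
/// readable without recalling operator precedence.
fn should_enqueue(needs_compile: bool, is_fresh: bool, is_artifact: bool) -> bool {
    (needs_compile && !is_fresh) || is_artifact
}

fn main() {
    // Whatever the first term evaluates to, an artifact unit is enqueued.
    assert!(should_enqueue(false, true, true));
    assert!(!should_enqueue(false, true, false));
}
```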

Replace boolean and `/*artifact*/ <bool>` with `IsArtifact::(Yes/No)`
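A minimal sketch of what such a replacement type could look like (the real cargo type may differ in derives and helpers):

```rust
/// A named two-state type replacing a bare `bool`, so call sites read
/// `IsArtifact::Yes` instead of `/*artifact*/ true`.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum IsArtifact {
    Yes,
    No,
}

impl IsArtifact {
    /// Mirrors the `artifact.is_true()` usage seen elsewhere in this PR.
    pub fn is_true(self) -> bool {
        matches!(self, IsArtifact::Yes)
    }
}

fn main() {
    assert!(IsArtifact::Yes.is_true());
    assert!(!IsArtifact::No.is_true());
}
```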

fix `doc::doc_lib_false()` test

It broke due to major breakage in the way dependencies are calculated.

Now we differentiate between deps computation for docs and for building.

Avoid testing for doctest cross-compilation message

It seems to be present on my machine, but isn't on linux and it's
probably better to leave it out entirely and focus on the portions
of consecutive output that we want to see at least.

A test to validate features are unified across libraries and those in artifact deps in the same target

Allow aarch64 MacOS to crosscompile to an easily executable alternative target

That way more tests can run locally.

Support for feature resolution per target

The implementation is taken directly from RFC-3176 and notably lacks
the 'multidep' part.

Doing this definitely has the benefit of making entirely clear
'what is what' and helps to greatly reduce the scope of RFC-3176
when it's rebuilt based on the latest RFC-3028, which we are implementing
right now.

Also
----
- A test which proves that artifact deps with a different target don't have a feature namespace yet

- Add a test to validate features are namespaced by target

  Previously it didn't work because it relies on resolver = "2".

- 'cargo metadata' test to see how artifact-deps are presented

- Missed an opportunity for using the newly introduced `PackageFeaturesKey`

- Use a HashMap to store name->value relations for artifact environment variables

  This is semantically closer to what's intended.

  also: Remove a by now misleading comment

Prevent resolver crash if `target = "target"` is encountered in non-build dependencies

A warning was emitted before, now we also apply a fix.

Previously the test didn't fail as it accidentally used the old
resolver, which now has been removed.

Abort in parsing stage if nightly flag is not set and 'artifact' is used

There is no good reason to delay errors to a later stage when code
tries to use artifacts via environment variables which are not present.

Change wording of warning message into what's expected for an error message

remove unnecessary `Result` in `collect()` call

Improve logic to warn if dependencies are ignored due to missing libraries

The improvement here is to trigger correctly if any dependency of a
crate is potentially a library, without having an actual library target
as part of the package specification.

Due to artifact dependencies it's also possible to have a dependency
on the same crate of the same version, hence the package name
isn't necessarily a unique name anymore. Now the name of the actual
dependency in the toml file is used to alleviate this.

Various small changes for readability and consistency

A failing test to validate artifacts work in published crates as well

Originally this should have been a test to see that target acquisition works,
but this more pressing issue surfaced instead.

Make artifacts known to the registry data (backwards compatible)

Now artifacts are serialized into the registry on publish (at least
if this code is actually used in the real crates-io registry) which
allows the resolve stage to contain artifact information.

This seems to be in line with the idea to provide cargo with all
information it needs to do package resolution without downloading
the actual manifest.

Pick up all artifact targets into target info once resolve data is available

Even though this works in the test at hand, it clearly shows there
is a cyclic dependency between the resolve and the target data.

In theory, one would have to repeat resolution until it settles
while avoiding cycles.

Maybe there is a better way.

Add `bindeps`/artifact dependencies to `unstable.md` with examples

Fix tests

Various small improvements

Greatly simplify artifact environment propagation to commands

Remove all adjustments to cargo-metadata, but leave tests

The tests are to record the status quo with the current code
when artifact dependencies are present and assure the information
is not entirely non-sensical.

Revert "Make artifacts known to the registry data (backwards compatible)"

This reverts commit adc5f8ad04840af9fd06c964cfcdffb8c30769b0.

Ideally we are able to make it work without altering the registry
storage format. This could work if information from the package
set is added to the resolve information.

Enrich resolves information with additional information from downloaded manifests

Resolve information comes from the registry, and it's only as rich as
needed to know which packages take part in the build.

Artifacts, however, don't influence dependency resolution, hence they
shouldn't be part of it.

So that artifact information is present nonetheless when it matters,
we port it back to the resolve graph where it will be needed later.

Collect 'forced-target' information from non-workspace members as well

This is needed as these targets aren't present in the registry and
thus can't be picked up by traversing non-workspace members.

The mechanism used to pick up artifact targets can also be used
to pick up these targets.

Remove unnecessary adjustment of doc test

refactor `State::deps()` to have filter; re-enable accidentally disabled test

The initial rebasing started out with a separate `deps_filtered()`
method to retain the original capabilities while minimizing the chance
for surprises. It turned out that all the changes combined in this PR
make heavy use of filtering capabilities to the point where
`deps(<without filter>)` was unused. This suggested that it's required
to keep it as is without a way to inline portions of it.

For the original change that triggered this rebase, see

bd45ac81ba

The fix originally made was reapplied by allowing to re-use the
required filter, but without inlining it.

Always error on invalid artifact setup, with or without enabled bindeps feature

Clarify how critical resolver code around artifact is working

Remove workaround in favor of deferring a proper implementation

See https://github.com/rust-lang/cargo/pull/9992#issuecomment-1033394197
for reference and the TODO in the ignored test for more information.

truncate comments at 80-90c; cleanup

- remove unused method
- remove '-Z unstable-options'
- improve error message
- improve the way MSVC special cases are targeted in tests
- improve how executables are found on non-MSVC

Avoid depending on output of rustc

There is cyclic dependency between rustc and cargo which makes it
impossible to adjust cargo's expectations on rustc without leaving
broken commits in rustc and cargo.

Add missing documentation

fix incorrect removal of non-artifact libs

This is also the first step towards cleaning up the filtering logic,
which is still making some logic harder to understand than it needs to be.

The goal is to get it to be closer to what's currently on master.

Another test was added to have more safety regarding the overall
library inclusion logic.

inline `build_artifact_requirements_to_units()`

Simplify filtering

This adds a default filter to `state.deps(…)` making it similar to
what's currently in master, while creating another version of it
to allow setting a custom filter. This is needed as the default filter
won't allow build dependencies, which we need in this particular case.

`calc_artifact_deps(…)` now hard-codes the default filter which is
needed due to the use of `any` here:
c0e6abe384/src/cargo/core/compiler/unit_dependencies.rs (L1119).

This commit is contained in:
Sebastian Thiel 2021-10-21 17:57:23 +08:00
parent 86376c8dd4
commit 7248f4b70d
36 changed files with 4321 additions and 402 deletions


@ -191,6 +191,7 @@ pub fn native_arch() -> &'static str {
.expect("Target triple has unexpected format")
{
"x86_64" => "x86_64",
"aarch64" => "aarch64",
"i686" => "x86",
_ => panic!("This test should be gated on cross_compile::disabled."),
}
@ -200,7 +201,9 @@ pub fn native_arch() -> &'static str {
///
/// Only use this function on tests that check `cross_compile::disabled`.
pub fn alternate() -> &'static str {
if cfg!(target_os = "macos") {
if cfg!(all(target_os = "macos", target_arch = "aarch64")) {
"x86_64-apple-darwin"
} else if cfg!(target_os = "macos") {
"x86_64-apple-ios"
} else if cfg!(target_os = "linux") {
"i686-unknown-linux-gnu"


@ -331,6 +331,7 @@ pub struct Dependency {
name: String,
vers: String,
kind: String,
artifact: Option<(String, Option<String>)>,
target: Option<String>,
features: Vec<String>,
registry: Option<String>,
@ -591,6 +592,7 @@ impl Package {
"features": dep.features,
"default_features": true,
"target": dep.target,
"artifact": dep.artifact,
"optional": dep.optional,
"kind": dep.kind,
"registry": registry_url,
@ -744,6 +746,12 @@ impl Package {
"#,
target, kind, dep.name, dep.vers
));
if let Some((artifact, target)) = &dep.artifact {
manifest.push_str(&format!("artifact = \"{}\"\n", artifact));
if let Some(target) = &target {
manifest.push_str(&format!("target = \"{}\"\n", target))
}
}
if let Some(registry) = &dep.registry {
assert_eq!(registry, "alternative");
manifest.push_str(&format!("registry-index = \"{}\"", alt_registry_url()));
@ -799,6 +807,7 @@ impl Dependency {
name: name.to_string(),
vers: vers.to_string(),
kind: "normal".to_string(),
artifact: None,
target: None,
features: Vec::new(),
package: None,
@ -825,6 +834,13 @@ impl Dependency {
self
}
/// Change the artifact to be of the given kind, like "bin", or "staticlib",
/// along with a specific target triple if provided.
pub fn artifact(&mut self, kind: &str, target: Option<String>) -> &mut Self {
self.artifact = Some((kind.to_string(), target));
self
}
/// Adds `registry = $registry` to this dependency.
pub fn registry(&mut self, registry: &str) -> &mut Self {
self.registry = Some(registry.to_string());


@ -0,0 +1,57 @@
/// Generate artifact information from unit dependencies for configuring the compiler environment.
use crate::core::compiler::unit_graph::UnitDep;
use crate::core::compiler::{Context, CrateType, FileFlavor, Unit};
use crate::core::TargetKind;
use crate::CargoResult;
use std::collections::HashMap;
use std::ffi::OsString;
/// Return all environment variables for the given unit-dependencies
/// if artifacts are present.
pub fn get_env(
cx: &Context<'_, '_>,
dependencies: &[UnitDep],
) -> CargoResult<HashMap<String, OsString>> {
let mut env = HashMap::new();
for unit_dep in dependencies.iter().filter(|d| d.unit.artifact.is_true()) {
for artifact_path in cx
.outputs(&unit_dep.unit)?
.iter()
.filter_map(|f| (f.flavor == FileFlavor::Normal).then(|| &f.path))
{
let artifact_type_upper = unit_artifact_type_name_upper(&unit_dep.unit);
let dep_name = unit_dep.dep_name.unwrap_or(unit_dep.unit.pkg.name());
let dep_name_upper = dep_name.to_uppercase().replace("-", "_");
let var = format!("CARGO_{}_DIR_{}", artifact_type_upper, dep_name_upper);
let path = artifact_path.parent().expect("parent dir for artifacts");
env.insert(var, path.to_owned().into());
let var = format!(
"CARGO_{}_FILE_{}_{}",
artifact_type_upper,
dep_name_upper,
unit_dep.unit.target.name()
);
env.insert(var, artifact_path.to_owned().into());
if unit_dep.unit.target.name() == dep_name.as_str() {
let var = format!("CARGO_{}_FILE_{}", artifact_type_upper, dep_name_upper,);
env.insert(var, artifact_path.to_owned().into());
}
}
}
Ok(env)
}
fn unit_artifact_type_name_upper(unit: &Unit) -> &'static str {
match unit.target.kind() {
TargetKind::Lib(kinds) => match kinds.as_slice() {
&[CrateType::Cdylib] => "CDYLIB",
&[CrateType::Staticlib] => "STATICLIB",
invalid => unreachable!("BUG: artifacts cannot be of type {:?}", invalid),
},
TargetKind::Bin => "BIN",
invalid => unreachable!("BUG: artifacts cannot be of type {:?}", invalid),
}
}


@ -1,7 +1,7 @@
use crate::core::compiler::{
BuildOutput, CompileKind, CompileMode, CompileTarget, Context, CrateType,
};
use crate::core::{Dependency, Target, TargetKind, Workspace};
use crate::core::{Dependency, Package, Target, TargetKind, Workspace};
use crate::util::config::{Config, StringList, TargetConfig};
use crate::util::{CargoResult, Rustc};
use anyhow::Context as _;
@ -748,11 +748,17 @@ impl<'cfg> RustcTargetData<'cfg> {
// Get all kinds we currently know about.
//
// For now, targets can only ever come from the root workspace
// units as artifact dependencies are not a thing yet, so this
// correctly represents all the kinds that can happen. When we
// have artifact dependencies or other ways for targets to
// appear at places that are not the root units, we may have
// to revisit this.
// units and artifact dependencies, so this
// correctly represents all the kinds that can happen. When we have
// other ways for targets to appear at places that are not the root units,
// we may have to revisit this.
fn artifact_targets(package: &Package) -> impl Iterator<Item = CompileKind> + '_ {
package
.manifest()
.dependencies()
.iter()
.filter_map(|d| d.artifact()?.target()?.to_compile_kind())
}
let all_kinds = requested_kinds
.iter()
.copied()
@ -761,25 +767,32 @@ impl<'cfg> RustcTargetData<'cfg> {
.default_kind()
.into_iter()
.chain(p.manifest().forced_kind())
.chain(artifact_targets(p))
}));
for kind in all_kinds {
if let CompileKind::Target(target) = kind {
if !res.target_config.contains_key(&target) {
res.target_config
.insert(target, res.config.target_cfg_triple(target.short_name())?);
}
if !res.target_info.contains_key(&target) {
res.target_info.insert(
target,
TargetInfo::new(res.config, &res.requested_kinds, &res.rustc, kind)?,
);
}
}
res.merge_compile_kind(kind)?;
}
Ok(res)
}
/// Insert `kind` into our `target_info` and `target_config` members if it isn't present yet.
fn merge_compile_kind(&mut self, kind: CompileKind) -> CargoResult<()> {
if let CompileKind::Target(target) = kind {
if !self.target_config.contains_key(&target) {
self.target_config
.insert(target, self.config.target_cfg_triple(target.short_name())?);
}
if !self.target_info.contains_key(&target) {
self.target_info.insert(
target,
TargetInfo::new(self.config, &self.requested_kinds, &self.rustc, kind)?,
);
}
}
Ok(())
}
/// Returns a "short" name for the given kind, suitable for keying off
/// configuration in Cargo or presenting to users.
pub fn short_name<'a>(&'a self, kind: &'a CompileKind) -> &'a str {


@ -25,6 +25,9 @@ pub struct Doctest {
///
/// This is used for indexing [`Compilation::extra_env`].
pub script_meta: Option<Metadata>,
/// Environment variables to set in the rustdoc process.
pub env: HashMap<String, OsString>,
}
/// Information about the output of a unit.
@ -190,14 +193,14 @@ impl<'cfg> Compilation<'cfg> {
) -> CargoResult<ProcessBuilder> {
let rustdoc = ProcessBuilder::new(&*self.config.rustdoc()?);
let cmd = fill_rustc_tool_env(rustdoc, unit);
let mut p = self.fill_env(cmd, &unit.pkg, script_meta, unit.kind, true)?;
unit.target.edition().cmd_edition_arg(&mut p);
let mut cmd = self.fill_env(cmd, &unit.pkg, script_meta, unit.kind, true)?;
unit.target.edition().cmd_edition_arg(&mut cmd);
for crate_type in unit.target.rustc_crate_types() {
p.arg("--crate-type").arg(crate_type.as_str());
cmd.arg("--crate-type").arg(crate_type.as_str());
}
Ok(p)
Ok(cmd)
}
/// Returns a [`ProcessBuilder`] appropriate for running a process for the


@ -158,8 +158,8 @@ impl CompileTarget {
/// Typically this is pretty much the same as `short_name`, but for the case
/// of JSON target files this will be a full canonicalized path name for the
/// current filesystem.
pub fn rustc_target(&self) -> &str {
&self.name
pub fn rustc_target(&self) -> InternedString {
self.name
}
/// Returns a "short" version of the target name suitable for usage within


@ -201,6 +201,8 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
self.build_script_dir(unit)
} else if unit.target.is_example() {
self.layout(unit.kind).examples().to_path_buf()
} else if unit.artifact.is_true() {
self.artifact_dir(unit)
} else {
self.deps_dir(unit).to_path_buf()
}
@ -287,6 +289,30 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
self.layout(CompileKind::Host).build().join(dir)
}
/// Returns the directory for compiled artifacts files.
/// `/path/to/target/{debug,release}/deps/artifact/KIND/PKG-HASH`
fn artifact_dir(&self, unit: &Unit) -> PathBuf {
assert!(self.metas.contains_key(unit));
assert!(unit.artifact.is_true());
let dir = self.pkg_dir(unit);
let kind = match unit.target.kind() {
TargetKind::Bin => "bin",
TargetKind::Lib(lib_kinds) => match lib_kinds.as_slice() {
&[CrateType::Cdylib] => "cdylib",
&[CrateType::Staticlib] => "staticlib",
invalid => unreachable!(
"BUG: unexpected artifact library type(s): {:?} - these should have been split",
invalid
),
},
invalid => unreachable!(
"BUG: {:?} are not supposed to be used as artifacts",
invalid
),
};
self.layout(unit.kind).artifact().join(dir).join(kind)
}
/// Returns the directory where information about running a build script
/// is stored.
/// `/path/to/target/{debug,release}/build/PKG-HASH`
@ -354,7 +380,12 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
if unit.mode != CompileMode::Build || file_type.flavor == FileFlavor::Rmeta {
return None;
}
// Only uplift:
// Artifact dependencies are never uplifted.
if unit.artifact.is_true() {
return None;
}
// - Binaries: The user always wants to see these, even if they are
// implicitly built (for example for integration tests).
// - dylibs: This ensures that the dynamic linker pulls in all the


@ -3,7 +3,7 @@ use std::path::{Path, PathBuf};
use std::sync::{Arc, Mutex};
use crate::core::compiler::compilation::{self, UnitOutput};
use crate::core::compiler::{self, Unit};
use crate::core::compiler::{self, artifact, Unit};
use crate::core::PackageId;
use crate::util::errors::CargoResult;
use crate::util::profile;
@ -133,7 +133,7 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
// We need to make sure that if there were any previous docs
// already compiled, they were compiled with the same Rustc version that we're currently
// using. Otherways we must remove the `doc/` folder and compile again forcing a rebuild.
// using. Otherwise we must remove the `doc/` folder and compile again forcing a rebuild.
//
// This is important because the `.js`/`.html` & `.css` files that are generated by Rustc don't have
// any versioning (See https://github.com/rust-lang/cargo/issues/8461).
@ -262,6 +262,7 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
unstable_opts,
linker: self.bcx.linker(unit.kind),
script_meta,
env: artifact::get_env(&self, self.unit_deps(unit))?,
});
}

View File

@ -1,5 +1,6 @@
use super::job::{Freshness, Job, Work};
use super::{fingerprint, Context, LinkType, Unit};
use crate::core::compiler::artifact;
use crate::core::compiler::context::Metadata;
use crate::core::compiler::job_queue::JobState;
use crate::core::{profiles::ProfileRoot, PackageId, Target};
@ -203,6 +204,11 @@ fn build_work(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoResult<Job> {
.env("RUSTDOC", &*bcx.config.rustdoc()?)
.inherit_jobserver(&cx.jobserver);
// Find all artifact dependencies and make their file and containing directory discoverable using environment variables.
for (var, value) in artifact::get_env(cx, dependencies)? {
cmd.env(&var, value);
}
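The variables injected above follow the naming scheme RFC 3028 describes, roughly `CARGO_<KIND>_DIR_<DEP>` for the containing directory and `CARGO_<KIND>_FILE_<DEP>_<name>` for individual files, with the dependency name uppercased and dashes mapped to underscores. A hedged sketch of that name construction only; the actual set of variables is computed in `artifact::get_env`:

```rust
// Build the environment variable name for one artifact of dependency
// `dep_name`. `kind` is e.g. "BIN", "CDYLIB", or "STATICLIB"; `target_name`
// selects a specific file rather than the directory.
fn artifact_env_var(kind: &str, dep_name: &str, target_name: Option<&str>) -> String {
    // Dependency names are uppercased with `-` replaced by `_`, as is
    // conventional for cargo-provided environment variables.
    let dep = dep_name.to_uppercase().replace('-', "_");
    match target_name {
        Some(bin) => format!("CARGO_{}_FILE_{}_{}", kind, dep, bin),
        None => format!("CARGO_{}_DIR_{}", kind, dep),
    }
}
```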
if let Some(linker) = &bcx.target_data.target_config(unit.kind).linker {
cmd.env(
"RUSTC_LINKER",

View File

@ -395,7 +395,8 @@ impl<'cfg> JobQueue<'cfg> {
.filter(|dep| {
// Binaries aren't actually needed to *compile* tests, just to run
// them, so we don't include this dependency edge in the job graph.
!dep.unit.target.is_test() && !dep.unit.target.is_bin()
(!dep.unit.target.is_test() && !dep.unit.target.is_bin())
|| dep.unit.artifact.is_true()
})
.map(|dep| {
// Handle the case here where our `unit -> dep` dependency may

View File

@ -47,6 +47,9 @@
//! # prevent collisions. One notable exception is dynamic libraries.
//! deps/
//!
//! # Each artifact dependency gets its own directory.
//! /artifact/$pkgname-$META/$kind
//!
//! # Root directory for all compiled examples.
//! examples/
//!
@ -117,6 +120,8 @@ pub struct Layout {
deps: PathBuf,
/// The directory for build scripts: `$dest/build`
build: PathBuf,
/// The directory for artifacts, i.e. binaries, cdylibs, staticlibs: `$dest/deps/artifact`
artifact: PathBuf,
/// The directory for incremental files: `$dest/incremental`
incremental: PathBuf,
/// The directory for fingerprints: `$dest/.fingerprint`
@ -164,10 +169,13 @@ impl Layout {
let lock = dest.open_rw(".cargo-lock", ws.config(), "build directory")?;
let root = root.into_path_unlocked();
let dest = dest.into_path_unlocked();
let deps = dest.join("deps");
let artifact = deps.join("artifact");
Ok(Layout {
deps: dest.join("deps"),
deps,
build: dest.join("build"),
artifact,
incremental: dest.join("incremental"),
fingerprint: dest.join(".fingerprint"),
examples: dest.join("examples"),
@ -222,6 +230,10 @@ impl Layout {
pub fn build(&self) -> &Path {
&self.build
}
/// Fetch the artifact path.
pub fn artifact(&self) -> &Path {
&self.artifact
}
/// Create and return the tmp path.
pub fn prepare_tmp(&self) -> CargoResult<&Path> {
paths::create_dir_all(&self.tmp)?;

View File

@ -1,3 +1,4 @@
pub mod artifact;
mod build_config;
mod build_context;
mod build_plan;
@ -261,8 +262,16 @@ fn rustc(cx: &mut Context<'_, '_>, unit: &Unit, exec: &Arc<dyn Executor>) -> Car
let fingerprint_dir = cx.files().fingerprint_dir(unit);
let script_metadata = cx.find_build_script_metadata(unit);
let is_local = unit.is_local();
let artifact = unit.artifact;
return Ok(Work::new(move |state| {
// Artifacts are in a different location than typical units,
// hence we must ensure the crate- and target-dependent
// directory exists.
if artifact.is_true() {
paths::create_dir_all(&root)?;
}
// Only at runtime have we discovered what the extra -L and -l
// arguments are for native libraries, so we process those here. We
// also need to be sure to add any -L paths for our plugins to the
@ -1105,10 +1114,9 @@ fn build_deps_args(
.iter()
.any(|dep| !dep.unit.mode.is_doc() && dep.unit.target.is_linkable())
{
if let Some(dep) = deps
.iter()
.find(|dep| !dep.unit.mode.is_doc() && dep.unit.target.is_lib())
{
if let Some(dep) = deps.iter().find(|dep| {
!dep.unit.mode.is_doc() && dep.unit.target.is_lib() && !dep.unit.artifact.is_true()
}) {
bcx.config.shell().warn(format!(
"The package `{}` \
provides no linkable target. The compiler might raise an error while compiling \
@ -1133,6 +1141,10 @@ fn build_deps_args(
cmd.arg(arg);
}
for (var, env) in artifact::get_env(cx, deps)? {
cmd.env(&var, env);
}
// This will only be set if we're already using a feature
// requiring nightly rust
if unstable_opts {

View File

@ -1,5 +1,6 @@
//! Code for building the standard library.
use crate::core::compiler::unit_dependencies::IsArtifact;
use crate::core::compiler::UnitInterner;
use crate::core::compiler::{CompileKind, CompileMode, RustcTargetData, Unit};
use crate::core::profiles::{Profiles, UnitFor};
@ -153,15 +154,17 @@ pub fn generate_std_roots(
.iter()
.find(|t| t.is_lib())
.expect("std has a lib");
let unit_for = UnitFor::new_normal();
// I don't think we need to bother with Check here, the difference
// in time is minimal, and the difference in caching is
// significant.
let mode = CompileMode::Build;
let features = std_features.activated_features(pkg.package_id(), FeaturesFor::NormalOrDev);
let features = std_features.activated_features(
pkg.package_id(),
FeaturesFor::NormalOrDevOrArtifactTarget(None),
);
for kind in kinds {
let list = ret.entry(*kind).or_insert_with(Vec::new);
let unit_for = UnitFor::new_normal(*kind);
let profile = profiles.get_profile(
pkg.package_id(),
/*is_member*/ false,
@ -179,6 +182,7 @@ pub fn generate_std_roots(
features.clone(),
/*is_std*/ true,
/*dep_hash*/ 0,
IsArtifact::No,
));
}
}

View File

@ -1,4 +1,4 @@
use crate::core::compiler::{CompileKind, CompileMode, CrateType};
use crate::core::compiler::{unit_dependencies::IsArtifact, CompileKind, CompileMode, CrateType};
use crate::core::manifest::{Target, TargetKind};
use crate::core::{profiles::Profile, Package};
use crate::util::hex::short_hash;
@ -55,6 +55,9 @@ pub struct UnitInner {
/// The `cfg` features to enable for this unit.
/// This must be sorted.
pub features: Vec<InternedString>,
/// If `true`, the dependency is an artifact dependency, requiring special handling when
/// calculating output directories, linkage and environment variables provided to builds.
pub artifact: IsArtifact,
/// Whether this is a standard library unit.
pub is_std: bool,
/// A hash of all dependencies of this unit.
@ -135,6 +138,7 @@ impl fmt::Debug for Unit {
.field("kind", &self.kind)
.field("mode", &self.mode)
.field("features", &self.features)
.field("artifact", &self.artifact.is_true())
.field("is_std", &self.is_std)
.field("dep_hash", &self.dep_hash)
.finish()
@ -179,6 +183,7 @@ impl UnitInterner {
features: Vec<InternedString>,
is_std: bool,
dep_hash: u64,
artifact: IsArtifact,
) -> Unit {
let target = match (is_std, target.kind()) {
// This is a horrible hack to support build-std. `libstd` declares
@ -210,6 +215,7 @@ impl UnitInterner {
features,
is_std,
dep_hash,
artifact,
});
Unit { inner }
}

View File

@ -15,19 +15,25 @@
//! (for example, with and without tests), so we actually build a dependency
//! graph of `Unit`s, which capture these properties.
use std::collections::{HashMap, HashSet};
use log::trace;
use crate::core::compiler::unit_graph::{UnitDep, UnitGraph};
use crate::core::compiler::UnitInterner;
use crate::core::compiler::{CompileKind, CompileMode, RustcTargetData, Unit};
use crate::core::compiler::{
CompileKind, CompileMode, CrateType, RustcTargetData, Unit, UnitInterner,
};
use crate::core::dependency::{Artifact, ArtifactKind, ArtifactTarget, DepKind};
use crate::core::profiles::{Profile, Profiles, UnitFor};
use crate::core::resolver::features::{FeaturesFor, ResolvedFeatures};
use crate::core::resolver::Resolve;
use crate::core::{Dependency, Package, PackageId, PackageSet, Target, Workspace};
use crate::core::{Dependency, Package, PackageId, PackageSet, Target, TargetKind, Workspace};
use crate::ops::resolve_all_features;
use crate::util::interning::InternedString;
use crate::util::Config;
use crate::CargoResult;
use log::trace;
use std::collections::{HashMap, HashSet};
const IS_NO_ARTIFACT_DEP: Option<&'static Artifact> = None;
/// Collection of stuff used while creating the `UnitGraph`.
struct State<'a, 'cfg> {
@ -54,6 +60,19 @@ struct State<'a, 'cfg> {
dev_dependency_edges: HashSet<(Unit, Unit)>,
}
/// A boolean-like value to indicate whether a `Unit` is an artifact or not.
#[derive(Copy, Clone, Hash, PartialEq, Eq, PartialOrd, Ord)]
pub enum IsArtifact {
Yes,
No,
}
impl IsArtifact {
pub fn is_true(&self) -> bool {
matches!(self, IsArtifact::Yes)
}
}
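Using a dedicated two-variant enum instead of a bare `bool` keeps the many `intern(...)` call sites self-describing (`IsArtifact::No` rather than an anonymous `false`). A standalone copy of the type for illustration:

```rust
// Mirror of the `IsArtifact` flag as a standalone type. Deriving `Hash` and
// `Eq` matters because the flag participates in `Unit` interning.
#[derive(Copy, Clone, PartialEq, Eq, Hash, Debug)]
enum IsArtifact {
    Yes,
    No,
}

impl IsArtifact {
    // Convenience predicate mirroring the one in the diff above.
    fn is_true(self) -> bool {
        matches!(self, IsArtifact::Yes)
    }
}
```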
pub fn build_unit_dependencies<'a, 'cfg>(
ws: &'a Workspace<'cfg>,
package_set: &'a PackageSet<'cfg>,
@ -149,8 +168,9 @@ fn attach_std_deps(
if !unit.kind.is_host() && !unit.mode.is_run_custom_build() {
deps.extend(std_roots[&unit.kind].iter().map(|unit| UnitDep {
unit: unit.clone(),
unit_for: UnitFor::new_normal(),
unit_for: UnitFor::new_normal(unit.kind),
extern_crate_name: unit.pkg.name(),
dep_name: None,
// TODO: Does this `public` make sense?
public: true,
noprelude: true,
@ -179,25 +199,26 @@ fn deps_of_roots(roots: &[Unit], state: &mut State<'_, '_>) -> CargoResult<()> {
// cleared, and avoid building the lib thrice (once with `panic`, once
// without, once for `--test`). In particular, the lib included for
// Doc tests and examples are `Build` mode here.
let root_compile_kind = unit.kind;
let unit_for = if unit.mode.is_any_test() || state.global_mode.is_rustc_test() {
if unit.target.proc_macro() {
// Special-case for proc-macros, which are forced to for-host
// since they need to link with the proc_macro crate.
UnitFor::new_host_test(state.config)
UnitFor::new_host_test(state.config, root_compile_kind)
} else {
UnitFor::new_test(state.config)
UnitFor::new_test(state.config, root_compile_kind)
}
} else if unit.target.is_custom_build() {
// This normally doesn't happen, except `clean` aggressively
// generates all units.
UnitFor::new_host(false)
UnitFor::new_host(false, root_compile_kind)
} else if unit.target.proc_macro() {
UnitFor::new_host(true)
UnitFor::new_host(true, root_compile_kind)
} else if unit.target.for_host() {
// Plugin should never have panic set.
UnitFor::new_compiler()
UnitFor::new_compiler(root_compile_kind)
} else {
UnitFor::new_normal()
UnitFor::new_normal(root_compile_kind)
};
deps_of(unit, state, unit_for)?;
}
@ -241,37 +262,55 @@ fn compute_deps(
return compute_deps_doc(unit, state, unit_for);
}
let id = unit.pkg.package_id();
let filtered_deps = state.deps(unit, unit_for);
let mut ret = Vec::new();
let mut dev_deps = Vec::new();
for (id, deps) in filtered_deps {
let pkg = state.get(id);
let lib = match pkg.targets().iter().find(|t| t.is_lib()) {
Some(t) => t,
for (dep_pkg_id, deps) in state.deps(unit, unit_for) {
let dep_lib = match calc_artifact_deps(unit, unit_for, dep_pkg_id, &deps, state, &mut ret)?
{
Some(lib) => lib,
None => continue,
};
let mode = check_or_build_mode(unit.mode, lib);
let dep_unit_for = unit_for.with_dependency(unit, lib);
let dep_pkg = state.get(dep_pkg_id);
let mode = check_or_build_mode(unit.mode, dep_lib);
let dep_unit_for = unit_for.with_dependency(unit, dep_lib, unit_for.root_compile_kind());
let start = ret.len();
if state.config.cli_unstable().dual_proc_macros && lib.proc_macro() && !unit.kind.is_host()
if state.config.cli_unstable().dual_proc_macros
&& dep_lib.proc_macro()
&& !unit.kind.is_host()
{
let unit_dep = new_unit_dep(state, unit, pkg, lib, dep_unit_for, unit.kind, mode)?;
let unit_dep = new_unit_dep(
state,
unit,
dep_pkg,
dep_lib,
dep_unit_for,
unit.kind,
mode,
IS_NO_ARTIFACT_DEP,
)?;
ret.push(unit_dep);
let unit_dep =
new_unit_dep(state, unit, pkg, lib, dep_unit_for, CompileKind::Host, mode)?;
let unit_dep = new_unit_dep(
state,
unit,
dep_pkg,
dep_lib,
dep_unit_for,
CompileKind::Host,
mode,
IS_NO_ARTIFACT_DEP,
)?;
ret.push(unit_dep);
} else {
let unit_dep = new_unit_dep(
state,
unit,
pkg,
lib,
dep_pkg,
dep_lib,
dep_unit_for,
unit.kind.for_target(lib),
unit.kind.for_target(dep_lib),
mode,
IS_NO_ARTIFACT_DEP,
)?;
ret.push(unit_dep);
}
@ -311,6 +350,7 @@ fn compute_deps(
&& unit.mode.is_any_test()
&& (unit.target.is_test() || unit.target.is_bench())
{
let id = unit.pkg.package_id();
ret.extend(
unit.pkg
.targets()
@ -337,9 +377,10 @@ fn compute_deps(
unit,
&unit.pkg,
t,
UnitFor::new_normal(),
UnitFor::new_normal(unit_for.root_compile_kind()),
unit.kind.for_target(t),
CompileMode::Build,
IS_NO_ARTIFACT_DEP,
)
})
.collect::<CargoResult<Vec<UnitDep>>>()?,
@ -349,6 +390,58 @@ fn compute_deps(
Ok(ret)
}
/// Find artifacts for all `deps` of `unit` and add units that build these artifacts
/// to `ret`.
fn calc_artifact_deps<'a>(
unit: &Unit,
unit_for: UnitFor,
dep_id: PackageId,
deps: &[&Dependency],
state: &State<'a, '_>,
ret: &mut Vec<UnitDep>,
) -> CargoResult<Option<&'a Target>> {
let mut has_artifact_lib = false;
let mut maybe_non_artifact_lib = false;
let artifact_pkg = state.get(dep_id);
for dep in deps {
let artifact = match dep.artifact() {
Some(a) => a,
None => {
maybe_non_artifact_lib = true;
continue;
}
};
has_artifact_lib |= artifact.is_lib();
// Custom build scripts (build/compile) never get artifact dependencies,
// but the run-build-script step does (where it is handled).
if !unit.target.is_custom_build() {
debug_assert!(
!unit.mode.is_run_custom_build(),
"BUG: This should be handled in a separate branch"
);
ret.extend(artifact_targets_to_unit_deps(
unit,
unit_for.with_artifact_features(artifact),
state,
artifact
.target()
.and_then(|t| match t {
ArtifactTarget::BuildDependencyAssumeTarget => None,
ArtifactTarget::Force(kind) => Some(CompileKind::Target(kind)),
})
.unwrap_or(unit.kind),
artifact_pkg,
dep,
)?);
}
}
if has_artifact_lib || maybe_non_artifact_lib {
Ok(artifact_pkg.targets().iter().find(|t| t.is_lib()))
} else {
Ok(None)
}
}
/// Returns the dependencies needed to run a build script.
///
/// The `unit` provided must represent an execution of a build script, and
@ -356,7 +449,7 @@ fn compute_deps(
fn compute_deps_custom_build(
unit: &Unit,
unit_for: UnitFor,
state: &mut State<'_, '_>,
state: &State<'_, '_>,
) -> CargoResult<Vec<UnitDep>> {
if let Some(links) = unit.pkg.manifest().links() {
if state
@ -371,7 +464,10 @@ fn compute_deps_custom_build(
// All dependencies of this unit should use profiles for custom builds.
// If this is a build script of a proc macro, make sure it uses host
// features.
let script_unit_for = UnitFor::new_host(unit_for.is_for_host_features());
let script_unit_for = UnitFor::new_host(
unit_for.is_for_host_features(),
unit_for.root_compile_kind(),
);
// When not overridden, then the dependencies to run a build script are:
//
// 1. Compiling the build script itself.
@ -381,7 +477,7 @@ fn compute_deps_custom_build(
// We don't have a great way of handling (2) here right now so this is
// deferred until after the graph of all unit dependencies has been
// constructed.
let unit_dep = new_unit_dep(
let compile_script_unit = new_unit_dep(
state,
unit,
&unit.pkg,
@ -390,8 +486,151 @@ fn compute_deps_custom_build(
// Build scripts always compiled for the host.
CompileKind::Host,
CompileMode::Build,
IS_NO_ARTIFACT_DEP,
)?;
Ok(vec![unit_dep])
let mut result = vec![compile_script_unit];
// Include any artifact dependencies.
//
// This is essentially the same as `calc_artifact_deps`, but there are some
// subtle differences that require this to be implemented differently.
//
// Produce units that build all required artifact kinds (like binaries,
// static libraries, etc) with the correct compile target.
//
// Computing the compile target for artifact units is more involved as it has to handle
// various target configurations specific to artifacts, like `target = "target"` and
// `target = "<triple>"`, which makes knowing the root unit's compile target
// `root_unit_compile_target` necessary.
let root_unit_compile_target = unit_for.root_compile_kind();
let unit_for = UnitFor::new_host(/*host_features*/ true, root_unit_compile_target);
for (dep_pkg_id, deps) in state.deps(unit, script_unit_for) {
for dep in deps {
if dep.kind() != DepKind::Build || dep.artifact().is_none() {
continue;
}
let artifact_pkg = state.get(dep_pkg_id);
let artifact = dep.artifact().expect("artifact dep");
let resolved_artifact_compile_kind = artifact
.target()
.map(|target| target.to_resolved_compile_kind(root_unit_compile_target));
result.extend(artifact_targets_to_unit_deps(
unit,
unit_for.with_artifact_features_from_resolved_compile_kind(
resolved_artifact_compile_kind,
),
state,
resolved_artifact_compile_kind.unwrap_or(CompileKind::Host),
artifact_pkg,
dep,
)?);
}
}
Ok(result)
}
/// Given a `parent` unit containing a dependency `dep` whose package is `artifact_pkg`,
/// find all targets in `artifact_pkg` which refer to the `dep`'s artifact declaration
/// and turn them into units.
/// Due to the nature of artifact dependencies, a single dependency in a manifest can
/// cause one or more targets to be built, for instance with
/// `artifact = ["bin:a", "bin:b", "staticlib"]`, which is very different from normal
/// dependencies, which cause only a single unit to be created.
///
/// `compile_kind` is the computed kind for the future artifact unit
/// dependency; only the caller can pick the correct one.
fn artifact_targets_to_unit_deps(
parent: &Unit,
parent_unit_for: UnitFor,
state: &State<'_, '_>,
compile_kind: CompileKind,
artifact_pkg: &Package,
dep: &Dependency,
) -> CargoResult<Vec<UnitDep>> {
let ret =
match_artifacts_kind_with_targets(dep, artifact_pkg.targets(), parent.pkg.name().as_str())?
.into_iter()
.flat_map(|target| {
// We split target libraries into individual units, even though rustc is able
// to produce multiple kinds in a single invocation, for the sole reason that
// each artifact kind has its own output directory, something we can't easily
// teach rustc for now.
match target.kind() {
TargetKind::Lib(kinds) => Box::new(
kinds
.iter()
.filter(|tk| matches!(tk, CrateType::Cdylib | CrateType::Staticlib))
.map(|target_kind| {
new_unit_dep(
state,
parent,
artifact_pkg,
target
.clone()
.set_kind(TargetKind::Lib(vec![target_kind.clone()])),
parent_unit_for,
compile_kind,
CompileMode::Build,
dep.artifact(),
)
}),
) as Box<dyn Iterator<Item = _>>,
_ => Box::new(std::iter::once(new_unit_dep(
state,
parent,
artifact_pkg,
target,
parent_unit_for,
compile_kind,
CompileMode::Build,
dep.artifact(),
))),
}
})
.collect::<Result<Vec<_>, _>>()?;
Ok(ret)
}
/// Given a dependency with an artifact `artifact_dep` and a set of available `targets`
/// of its package, find a target for each artifact kind that is to be built.
///
/// Failure to match any target results in an error mentioning the parent manifest's
/// `parent_package` name.
fn match_artifacts_kind_with_targets<'a>(
artifact_dep: &Dependency,
targets: &'a [Target],
parent_package: &str,
) -> CargoResult<HashSet<&'a Target>> {
let mut out = HashSet::new();
let artifact_requirements = artifact_dep.artifact().expect("artifact present");
for artifact_kind in artifact_requirements.kinds() {
let mut extend = |filter: &dyn Fn(&&Target) -> bool| {
let mut iter = targets.iter().filter(filter).peekable();
let found = iter.peek().is_some();
out.extend(iter);
found
};
let found = match artifact_kind {
ArtifactKind::Cdylib => extend(&|t| t.is_cdylib()),
ArtifactKind::Staticlib => extend(&|t| t.is_staticlib()),
ArtifactKind::AllBinaries => extend(&|t| t.is_bin()),
ArtifactKind::SelectedBinary(bin_name) => {
extend(&|t| t.is_bin() && t.name() == bin_name.as_str())
}
};
if !found {
anyhow::bail!(
"dependency `{}` in package `{}` requires a `{}` artifact to be present.",
artifact_dep.name_in_toml(),
parent_package,
artifact_kind
);
}
}
Ok(out)
}
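The matching above reduces to one predicate per requested artifact kind, with "no target matched" turning into a hard error. A simplified standalone sketch, modelling a target as a `(kind, name)` pair rather than cargo's `Target` type:

```rust
// Simplified artifact-kind requirements, mirroring `ArtifactKind` in spirit:
// `bin`, `bin:<name>`, `cdylib`, and `staticlib`.
enum Requirement<'a> {
    AllBinaries,
    SelectedBinary(&'a str),
    Cdylib,
    Staticlib,
}

// Return every target (as a `(kind, name)` pair) matching the requirement.
// An empty result corresponds to the `bail!` in the diff above.
fn select<'t>(
    req: &Requirement<'_>,
    targets: &'t [(&'t str, &'t str)],
) -> Vec<&'t (&'t str, &'t str)> {
    targets
        .iter()
        .filter(|(kind, name)| match req {
            Requirement::AllBinaries => *kind == "bin",
            Requirement::SelectedBinary(n) => *kind == "bin" && name == n,
            Requirement::Cdylib => *kind == "cdylib",
            Requirement::Staticlib => *kind == "staticlib",
        })
        .collect()
}
```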
/// Returns the dependencies necessary to document a package.
@ -400,43 +639,43 @@ fn compute_deps_doc(
state: &mut State<'_, '_>,
unit_for: UnitFor,
) -> CargoResult<Vec<UnitDep>> {
let deps = state.deps(unit, unit_for);
// To document a library, we depend on dependencies actually being
// built. If we're documenting *all* libraries, then we also depend on
// the documentation of the library being built.
let mut ret = Vec::new();
for (id, _deps) in deps {
let dep = state.get(id);
let lib = match dep.targets().iter().find(|t| t.is_lib()) {
for (id, deps) in state.deps(unit, unit_for) {
let dep_lib = match calc_artifact_deps(unit, unit_for, id, &deps, state, &mut ret)? {
Some(lib) => lib,
None => continue,
};
let dep_pkg = state.get(id);
// Rustdoc only needs rmeta files for regular dependencies.
// However, for plugins/proc macros, deps should be built like normal.
let mode = check_or_build_mode(unit.mode, lib);
let dep_unit_for = unit_for.with_dependency(unit, lib);
let mode = check_or_build_mode(unit.mode, dep_lib);
let dep_unit_for = unit_for.with_dependency(unit, dep_lib, unit_for.root_compile_kind());
let lib_unit_dep = new_unit_dep(
state,
unit,
dep,
lib,
dep_pkg,
dep_lib,
dep_unit_for,
unit.kind.for_target(lib),
unit.kind.for_target(dep_lib),
mode,
IS_NO_ARTIFACT_DEP,
)?;
ret.push(lib_unit_dep);
if lib.documented() {
if dep_lib.documented() {
if let CompileMode::Doc { deps: true } = unit.mode {
// Document this lib as well.
let doc_unit_dep = new_unit_dep(
state,
unit,
dep,
lib,
dep_pkg,
dep_lib,
dep_unit_for,
unit.kind.for_target(lib),
unit.kind.for_target(dep_lib),
unit.mode,
IS_NO_ARTIFACT_DEP,
)?;
ret.push(doc_unit_dep);
}
@ -457,7 +696,7 @@ fn compute_deps_doc(
.iter()
.find(|t| t.is_linkable() && t.documented())
{
let dep_unit_for = unit_for.with_dependency(unit, lib);
let dep_unit_for = unit_for.with_dependency(unit, lib, unit_for.root_compile_kind());
let lib_doc_unit = new_unit_dep(
state,
unit,
@ -466,6 +705,7 @@ fn compute_deps_doc(
dep_unit_for,
unit.kind.for_target(lib),
unit.mode,
IS_NO_ARTIFACT_DEP,
)?;
ret.push(lib_doc_unit);
}
@ -475,7 +715,10 @@ fn compute_deps_doc(
if state.ws.is_member(&unit.pkg) {
for scrape_unit in state.scrape_units.iter() {
// This needs to match the FeaturesFor used in cargo_compile::generate_targets.
let unit_for = UnitFor::new_host(scrape_unit.target.proc_macro());
let unit_for = UnitFor::new_host(
scrape_unit.target.proc_macro(),
unit_for.root_compile_kind(),
);
deps_of(scrape_unit, state, unit_for)?;
ret.push(new_unit_dep(
state,
@ -485,6 +728,7 @@ fn compute_deps_doc(
unit_for,
scrape_unit.kind,
scrape_unit.mode,
IS_NO_ARTIFACT_DEP,
)?);
}
}
@ -503,7 +747,7 @@ fn maybe_lib(
.find(|t| t.is_linkable())
.map(|t| {
let mode = check_or_build_mode(unit.mode, t);
let dep_unit_for = unit_for.with_dependency(unit, t);
let dep_unit_for = unit_for.with_dependency(unit, t, unit_for.root_compile_kind());
new_unit_dep(
state,
unit,
@ -512,6 +756,7 @@ fn maybe_lib(
dep_unit_for,
unit.kind.for_target(t),
mode,
IS_NO_ARTIFACT_DEP,
)
})
.transpose()
@ -562,7 +807,10 @@ fn dep_build_script(
// compiled twice. I believe it is not feasible to only build it
// once because it would break a large number of scripts (they
// would think they have the wrong set of features enabled).
let script_unit_for = UnitFor::new_host(unit_for.is_for_host_features());
let script_unit_for = UnitFor::new_host(
unit_for.is_for_host_features(),
unit_for.root_compile_kind(),
);
new_unit_dep_with_profile(
state,
unit,
@ -572,6 +820,7 @@ fn dep_build_script(
unit.kind,
CompileMode::RunCustomBuild,
profile,
IS_NO_ARTIFACT_DEP,
)
})
.transpose()
@ -604,6 +853,7 @@ fn new_unit_dep(
unit_for: UnitFor,
kind: CompileKind,
mode: CompileMode,
artifact: Option<&Artifact>,
) -> CargoResult<UnitDep> {
let is_local = pkg.package_id().source_id().is_path() && !state.is_std;
let profile = state.profiles.get_profile(
@ -614,7 +864,9 @@ fn new_unit_dep(
mode,
kind,
);
new_unit_dep_with_profile(state, parent, pkg, target, unit_for, kind, mode, profile)
new_unit_dep_with_profile(
state, parent, pkg, target, unit_for, kind, mode, profile, artifact,
)
}
fn new_unit_dep_with_profile(
@ -626,25 +878,34 @@ fn new_unit_dep_with_profile(
kind: CompileKind,
mode: CompileMode,
profile: Profile,
artifact: Option<&Artifact>,
) -> CargoResult<UnitDep> {
// TODO: consider making extern_crate_name return InternedString?
let extern_crate_name = InternedString::new(&state.resolve().extern_crate_name(
let (extern_crate_name, dep_name) = state.resolve().extern_crate_name_and_dep_name(
parent.pkg.package_id(),
pkg.package_id(),
target,
)?);
)?;
let public = state
.resolve()
.is_public_dep(parent.pkg.package_id(), pkg.package_id());
let features_for = unit_for.map_to_features_for();
let features_for = unit_for.map_to_features_for(artifact);
let features = state.activated_features(pkg.package_id(), features_for);
let unit = state
.interner
.intern(pkg, target, profile, kind, mode, features, state.is_std, 0);
let unit = state.interner.intern(
pkg,
target,
profile,
kind,
mode,
features,
state.is_std,
/*dep_hash*/ 0,
artifact.map_or(IsArtifact::No, |_| IsArtifact::Yes),
);
Ok(UnitDep {
unit,
unit_for,
extern_crate_name,
dep_name,
public,
noprelude: false,
})
@ -808,51 +1069,59 @@ impl<'a, 'cfg> State<'a, 'cfg> {
}
/// Returns a filtered set of dependencies for the given unit.
fn deps(&self, unit: &Unit, unit_for: UnitFor) -> Vec<(PackageId, &HashSet<Dependency>)> {
fn deps(&self, unit: &Unit, unit_for: UnitFor) -> Vec<(PackageId, Vec<&Dependency>)> {
let pkg_id = unit.pkg.package_id();
let kind = unit.kind;
self.resolve()
.deps(pkg_id)
.filter(|&(_id, deps)| {
.filter_map(|(id, deps)| {
assert!(!deps.is_empty());
deps.iter().any(|dep| {
// If this target is a build command, then we only want build
// dependencies, otherwise we want everything *other than* build
// dependencies.
if unit.target.is_custom_build() != dep.is_build() {
return false;
}
// If this dependency is **not** a transitive dependency, then it
// only applies to test/example targets.
if !dep.is_transitive()
&& !unit.target.is_test()
&& !unit.target.is_example()
&& !unit.mode.is_doc_scrape()
&& !unit.mode.is_any_test()
{
return false;
}
// If this dependency is only available for certain platforms,
// make sure we're only enabling it for that platform.
if !self.target_data.dep_platform_activated(dep, kind) {
return false;
}
// If this is an optional dependency, and the new feature resolver
// did not enable it, don't include it.
if dep.is_optional() {
let features_for = unit_for.map_to_features_for();
if !self.is_dep_activated(pkg_id, features_for, dep.name_in_toml()) {
let deps: Vec<_> = deps
.iter()
.filter(|dep| {
// If this target is a build command, then we only want build
// dependencies, otherwise we want everything *other than* build
// dependencies.
if unit.target.is_custom_build() != dep.is_build() {
return false;
}
}
// If we've gotten past all that, then this dependency is
// actually used!
true
})
// If this dependency is **not** a transitive dependency, then it
// only applies to test/example targets.
if !dep.is_transitive()
&& !unit.target.is_test()
&& !unit.target.is_example()
&& !unit.mode.is_doc_scrape()
&& !unit.mode.is_any_test()
{
return false;
}
// If this dependency is only available for certain platforms,
// make sure we're only enabling it for that platform.
if !self.target_data.dep_platform_activated(dep, kind) {
return false;
}
// If this is an optional dependency, and the new feature resolver
// did not enable it, don't include it.
if dep.is_optional() {
let features_for = unit_for.map_to_features_for(dep.artifact());
if !self.is_dep_activated(pkg_id, features_for, dep.name_in_toml()) {
return false;
}
}
// If we've gotten past all that, then this dependency is
// actually used!
true
})
.collect();
if deps.is_empty() {
None
} else {
Some((id, deps))
}
})
.collect()
}
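The filter above can be summarized as a small predicate. This standalone sketch keeps only the build-kind, transitivity, and optional-activation rules (platform activation is omitted for brevity); the struct and its field names are illustrative, not cargo's types:

```rust
// Illustrative stand-in for the per-dependency facts the real filter consults.
struct DepInfo {
    is_build: bool,
    is_transitive: bool,
    // Collapses "optional and not activated by the feature resolver".
    optional_and_inactive: bool,
}

// A dependency survives only if its build/non-build kind matches the unit,
// non-transitive deps are restricted to test/example-like units, and optional
// deps were activated.
fn keep_dep(unit_is_custom_build: bool, unit_is_test_or_example: bool, dep: &DepInfo) -> bool {
    if unit_is_custom_build != dep.is_build {
        return false;
    }
    if !dep.is_transitive && !unit_is_test_or_example {
        return false;
    }
    !dep.optional_and_inactive
}
```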

View File

@ -21,6 +21,12 @@ pub struct UnitDep {
pub unit_for: UnitFor,
/// The name the parent uses to refer to this dependency.
pub extern_crate_name: InternedString,
/// If `Some`, the name of the dependency as renamed in the toml.
/// It's particularly interesting for artifact dependencies, which rely on it
/// for naming their environment variables. Note that `extern_crate_name`
/// cannot be used for this, as it may also be the build target itself,
/// which isn't always the renamed dependency name.
pub dep_name: Option<InternedString>,
/// Whether or not this is a public dependency.
pub public: bool,
/// If `true`, the dependency should not be added to Rust's prelude.

View File

@ -3,12 +3,16 @@ use log::trace;
use semver::VersionReq;
use serde::ser;
use serde::Serialize;
use std::borrow::Cow;
use std::fmt;
use std::path::PathBuf;
use std::rc::Rc;
use crate::core::compiler::{CompileKind, CompileTarget};
use crate::core::{PackageId, SourceId, Summary};
use crate::util::errors::CargoResult;
use crate::util::interning::InternedString;
use crate::util::toml::StringOrVec;
use crate::util::OptVersionReq;
/// Information about a dependency requested by a Cargo manifest.
@ -40,6 +44,8 @@ struct Inner {
public: bool,
default_features: bool,
features: Vec<InternedString>,
// The presence of this information turns a dependency into an artifact dependency.
artifact: Option<Artifact>,
// This dependency should be used only for this platform.
// `None` means *all platforms*.
@ -57,6 +63,8 @@ struct SerializedDependency<'a> {
optional: bool,
uses_default_features: bool,
features: &'a [InternedString],
#[serde(skip_serializing_if = "Option::is_none")]
artifact: Option<&'a Artifact>,
target: Option<&'a Platform>,
/// The registry URL this dependency is from.
/// If None, then it comes from the default registry (crates.io).
@ -85,6 +93,7 @@ impl ser::Serialize for Dependency {
rename: self.explicit_name_in_toml().map(|s| s.as_str()),
registry: registry_id.as_ref().map(|sid| sid.url().as_str()),
path: self.source_id().local_path(),
artifact: self.artifact(),
}
.serialize(s)
}
@ -159,6 +168,7 @@ impl Dependency {
specified_req: false,
platform: None,
explicit_name_in_toml: None,
artifact: None,
}),
}
}
@ -403,4 +413,213 @@ impl Dependency {
}
self
}
pub(crate) fn set_artifact(&mut self, artifact: Artifact) {
Rc::make_mut(&mut self.inner).artifact = Some(artifact);
}
pub(crate) fn artifact(&self) -> Option<&Artifact> {
self.inner.artifact.as_ref()
}
/// Dependencies are potential Rust libs if they are not artifacts, or if they are
/// artifacts which are allowed to also be a library.
/// Previously, every dependency was potentially seen as a library.
pub(crate) fn maybe_lib(&self) -> bool {
self.artifact().map(|a| a.is_lib).unwrap_or(true)
}
}
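The `maybe_lib` rule reduces to: a non-artifact dependency may always be treated as a library, while an artifact dependency is one only when it additionally sets `lib = true`. A minimal sketch, modelling the artifact's `lib` flag as `Option<bool>` (with `None` standing for a non-artifact dependency):

```rust
// `None` models a plain dependency, `Some(is_lib)` an artifact dependency
// with its `lib` setting. Plain dependencies default to "maybe a lib".
fn maybe_lib(artifact_lib: Option<bool>) -> bool {
    artifact_lib.unwrap_or(true)
}
```

This is also why the no-linkable-target warning in `build_deps_args` now skips artifact units: `lib = false` artifact dependencies are intentionally not libraries.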
/// The presence of an artifact turns an ordinary dependency into an artifact dependency.
/// As such, it will build one or more artifacts of possibly various kinds,
/// making them available at build time for rustc invocations, or at runtime
/// for build scripts.
///
/// This information represents a requirement in the package this dependency refers to.
#[derive(PartialEq, Eq, Hash, Clone, Debug)]
pub struct Artifact {
inner: Rc<Vec<ArtifactKind>>,
is_lib: bool,
target: Option<ArtifactTarget>,
}
#[derive(Serialize)]
pub struct SerializedArtifact<'a> {
kinds: &'a [ArtifactKind],
lib: bool,
target: Option<&'a str>,
}
impl ser::Serialize for Artifact {
fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
where
S: ser::Serializer,
{
SerializedArtifact {
kinds: self.kinds(),
lib: self.is_lib,
target: self.target.as_ref().map(|t| match t {
ArtifactTarget::BuildDependencyAssumeTarget => "target",
ArtifactTarget::Force(target) => target.rustc_target().as_str(),
}),
}
.serialize(s)
}
}
impl Artifact {
pub(crate) fn parse(
artifacts: &StringOrVec,
is_lib: bool,
target: Option<&str>,
) -> CargoResult<Self> {
let kinds = ArtifactKind::validate(
artifacts
.iter()
.map(|s| ArtifactKind::parse(s))
.collect::<Result<Vec<_>, _>>()?,
)?;
Ok(Artifact {
inner: Rc::new(kinds),
is_lib,
target: target.map(ArtifactTarget::parse).transpose()?,
})
}
pub(crate) fn kinds(&self) -> &[ArtifactKind] {
&self.inner
}
pub(crate) fn is_lib(&self) -> bool {
self.is_lib
}
pub(crate) fn target(&self) -> Option<ArtifactTarget> {
self.target
}
}
#[derive(PartialEq, Eq, Hash, Copy, Clone, Ord, PartialOrd, Debug)]
pub enum ArtifactTarget {
/// Only applicable to build-dependencies, causing them to be built
/// for the given target (i.e. via `--target <triple>`) instead of for the host.
/// Has no effect on non-build dependencies.
BuildDependencyAssumeTarget,
/// The name of the platform triple, like `x86_64-apple-darwin`, that this
/// artifact will always be built for, no matter if it is a build,
/// normal or dev dependency.
Force(CompileTarget),
}
impl ArtifactTarget {
pub fn parse(target: &str) -> CargoResult<ArtifactTarget> {
Ok(match target {
"target" => ArtifactTarget::BuildDependencyAssumeTarget,
name => ArtifactTarget::Force(CompileTarget::new(name)?),
})
}
pub fn to_compile_kind(&self) -> Option<CompileKind> {
self.to_compile_target().map(CompileKind::Target)
}
pub fn to_compile_target(&self) -> Option<CompileTarget> {
match self {
ArtifactTarget::BuildDependencyAssumeTarget => None,
ArtifactTarget::Force(target) => Some(*target),
}
}
pub(crate) fn to_resolved_compile_kind(
&self,
root_unit_compile_kind: CompileKind,
) -> CompileKind {
match self {
ArtifactTarget::Force(target) => CompileKind::Target(*target),
ArtifactTarget::BuildDependencyAssumeTarget => root_unit_compile_kind,
}
}
pub(crate) fn to_resolved_compile_target(
&self,
root_unit_compile_kind: CompileKind,
) -> Option<CompileTarget> {
match self.to_resolved_compile_kind(root_unit_compile_kind) {
CompileKind::Host => None,
CompileKind::Target(target) => Some(target),
}
}
}
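The resolution of `target = "target"` versus a forced triple can be sketched standalone. `CompileKind` and `ArtifactTarget` below are simplified stand-ins that use plain strings where cargo uses `CompileTarget`:

```rust
// Sketch: how an artifact dependency's `target` setting resolves against the
// compile kind of the root unit.
#[derive(Debug, Clone, PartialEq)]
enum CompileKind {
    Host,
    Target(String),
}

enum ArtifactTarget {
    /// `target = "target"`: inherit the root unit's `--target`.
    BuildDependencyAssumeTarget,
    /// `target = "<triple>"`: always build for this triple.
    Force(String),
}

impl ArtifactTarget {
    fn to_resolved_compile_kind(&self, root: CompileKind) -> CompileKind {
        match self {
            ArtifactTarget::Force(t) => CompileKind::Target(t.clone()),
            ArtifactTarget::BuildDependencyAssumeTarget => root,
        }
    }
}

fn main() {
    let root = CompileKind::Target("x86_64-unknown-linux-gnu".to_string());
    // `target = "target"` inherits whatever `--target` produced the root unit.
    assert_eq!(
        ArtifactTarget::BuildDependencyAssumeTarget.to_resolved_compile_kind(root.clone()),
        root
    );
    // A forced triple wins regardless of the root unit's kind.
    assert_eq!(
        ArtifactTarget::Force("aarch64-apple-darwin".into())
            .to_resolved_compile_kind(CompileKind::Host),
        CompileKind::Target("aarch64-apple-darwin".into())
    );
}
```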
#[derive(PartialEq, Eq, Hash, Copy, Clone, Ord, PartialOrd, Debug)]
pub enum ArtifactKind {
/// We represent all binaries in this dependency
AllBinaries,
/// We represent a single binary
SelectedBinary(InternedString),
Cdylib,
Staticlib,
}
impl ser::Serialize for ArtifactKind {
fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
where
S: ser::Serializer,
{
let out: Cow<'_, str> = match *self {
ArtifactKind::AllBinaries => "bin".into(),
ArtifactKind::Staticlib => "staticlib".into(),
ArtifactKind::Cdylib => "cdylib".into(),
ArtifactKind::SelectedBinary(name) => format!("bin:{}", name.as_str()).into(),
};
out.serialize(s)
}
}
impl fmt::Display for ArtifactKind {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.write_str(match self {
ArtifactKind::Cdylib => "cdylib",
ArtifactKind::Staticlib => "staticlib",
ArtifactKind::AllBinaries => "bin",
ArtifactKind::SelectedBinary(bin_name) => return write!(f, "bin:{}", bin_name),
})
}
}
impl ArtifactKind {
fn parse(kind: &str) -> CargoResult<Self> {
Ok(match kind {
"bin" => ArtifactKind::AllBinaries,
"cdylib" => ArtifactKind::Cdylib,
"staticlib" => ArtifactKind::Staticlib,
_ => {
return kind
.strip_prefix("bin:")
.map(|bin_name| ArtifactKind::SelectedBinary(InternedString::new(bin_name)))
.ok_or_else(|| anyhow::anyhow!("'{}' is not a valid artifact specifier", kind))
}
})
}
fn validate(kinds: Vec<ArtifactKind>) -> CargoResult<Vec<ArtifactKind>> {
if kinds.iter().any(|k| matches!(k, ArtifactKind::AllBinaries))
&& kinds
.iter()
.any(|k| matches!(k, ArtifactKind::SelectedBinary(_)))
{
anyhow::bail!("Cannot specify both 'bin' and 'bin:<name>' binary artifacts, as 'bin' selects all available binaries.");
}
let mut kinds_without_dupes = kinds.clone();
kinds_without_dupes.sort();
kinds_without_dupes.dedup();
let num_dupes = kinds.len() - kinds_without_dupes.len();
if num_dupes != 0 {
anyhow::bail!(
"Found {} duplicate binary artifact{}",
num_dupes,
(num_dupes > 1).then(|| "s").unwrap_or("")
);
}
Ok(kinds)
}
}
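The parsing and validation rules above can be exercised in a standalone sketch. It mirrors `ArtifactKind::parse` and `ArtifactKind::validate` from the diff, but uses `String` in place of cargo's interned strings and error types:

```rust
// Simplified stand-in for cargo's artifact-kind parsing and validation.
#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord)]
enum ArtifactKind {
    AllBinaries,
    SelectedBinary(String),
    Cdylib,
    Staticlib,
}

fn parse(kind: &str) -> Result<ArtifactKind, String> {
    Ok(match kind {
        "bin" => ArtifactKind::AllBinaries,
        "cdylib" => ArtifactKind::Cdylib,
        "staticlib" => ArtifactKind::Staticlib,
        _ => {
            return kind
                .strip_prefix("bin:")
                .map(|name| ArtifactKind::SelectedBinary(name.to_string()))
                .ok_or_else(|| format!("'{}' is not a valid artifact specifier", kind))
        }
    })
}

fn validate(kinds: Vec<ArtifactKind>) -> Result<Vec<ArtifactKind>, String> {
    let all = kinds.iter().any(|k| matches!(k, ArtifactKind::AllBinaries));
    let selected = kinds
        .iter()
        .any(|k| matches!(k, ArtifactKind::SelectedBinary(_)));
    if all && selected {
        return Err("cannot specify both 'bin' and 'bin:<name>'".into());
    }
    // Detect duplicates by comparing lengths before and after dedup.
    let mut deduped = kinds.clone();
    deduped.sort();
    deduped.dedup();
    if deduped.len() != kinds.len() {
        return Err("duplicate artifact kinds".into());
    }
    Ok(kinds)
}

fn main() {
    assert_eq!(
        parse("bin:hello"),
        Ok(ArtifactKind::SelectedBinary("hello".into()))
    );
    assert!(parse("lib").is_err());
    // 'bin' together with 'bin:<name>' is rejected, as 'bin' already selects all.
    assert!(validate(vec![parse("bin").unwrap(), parse("bin:hello").unwrap()]).is_err());
}
```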


@ -625,13 +625,14 @@ macro_rules! unstable_cli_options {
unstable_cli_options!(
// Permanently unstable features:
allow_features: Option<BTreeSet<String>> = ("Allow *only* the listed unstable features"),
print_im_a_teapot: bool= (HIDDEN),
print_im_a_teapot: bool = (HIDDEN),
// All other unstable features.
// Please keep this list lexicographically ordered.
advanced_env: bool = (HIDDEN),
avoid_dev_deps: bool = ("Avoid installing dev-dependencies if possible"),
binary_dep_depinfo: bool = ("Track changes to dependency artifacts"),
bindeps: bool = ("Allow Cargo packages to depend on bin, cdylib, and staticlib crates, and use the artifacts built by those crates"),
#[serde(deserialize_with = "deserialize_build_std")]
build_std: Option<Vec<String>> = ("Enable Cargo to compile the standard library itself as part of a crate graph compilation"),
build_std_features: Option<Vec<String>> = ("Configure features enabled for the standard library itself when building the standard library"),
@ -839,6 +840,7 @@ impl CliUnstable {
"mtime-on-use" => self.mtime_on_use = parse_empty(k, v)?,
"named-profiles" => stabilized_warn(k, "1.57", STABILIZED_NAMED_PROFILES),
"binary-dep-depinfo" => self.binary_dep_depinfo = parse_empty(k, v)?,
"bindeps" => self.bindeps = parse_empty(k, v)?,
"build-std" => {
self.build_std = Some(crate::core::compiler::standard_lib::parse_unstable_flag(v))
}


@ -829,6 +829,13 @@ impl Target {
}
}
pub fn is_staticlib(&self) -> bool {
match self.kind() {
TargetKind::Lib(libs) => libs.iter().any(|l| *l == CrateType::Staticlib),
_ => false,
}
}
/// Returns whether this target produces an artifact which can be linked
/// into a Rust crate.
///


@ -519,7 +519,7 @@ impl<'cfg> PackageSet<'cfg> {
target_data,
force_all_targets,
);
for pkg_id in filtered_deps {
for (pkg_id, _dep) in filtered_deps {
collect_used_deps(
used,
resolve,
@ -553,20 +553,22 @@ impl<'cfg> PackageSet<'cfg> {
Ok(())
}
/// Check if there are any dependency packages that do not have any libs.
pub(crate) fn no_lib_pkgs(
/// Check if any dependency packages violate artifact constraints, in which case
/// we abort instantly, or lack a lib target entirely, which results in warnings.
pub(crate) fn warn_no_lib_packages_and_artifact_libs_overlapping_deps(
&self,
ws: &Workspace<'cfg>,
resolve: &Resolve,
root_ids: &[PackageId],
has_dev_units: HasDevUnits,
requested_kinds: &[CompileKind],
target_data: &RustcTargetData<'_>,
force_all_targets: ForceAllTargets,
) -> BTreeMap<PackageId, Vec<&Package>> {
root_ids
) -> CargoResult<()> {
let no_lib_pkgs: BTreeMap<PackageId, Vec<(&Package, &HashSet<Dependency>)>> = root_ids
.iter()
.map(|&root_id| {
let pkgs = PackageSet::filter_deps(
let dep_pkgs_to_deps: Vec<_> = PackageSet::filter_deps(
root_id,
resolve,
has_dev_units,
@ -574,21 +576,37 @@ impl<'cfg> PackageSet<'cfg> {
target_data,
force_all_targets,
)
.filter_map(|package_id| {
if let Ok(dep_pkg) = self.get_one(package_id) {
if !dep_pkg.targets().iter().any(|t| t.is_lib()) {
Some(dep_pkg)
} else {
None
}
} else {
None
}
})
.collect();
(root_id, pkgs)
let dep_pkgs_and_deps = dep_pkgs_to_deps
.into_iter()
.filter(|(_id, deps)| deps.iter().any(|dep| dep.maybe_lib()))
.filter_map(|(dep_package_id, deps)| {
self.get_one(dep_package_id).ok().and_then(|dep_pkg| {
(!dep_pkg.targets().iter().any(|t| t.is_lib())).then(|| (dep_pkg, deps))
})
})
.collect();
(root_id, dep_pkgs_and_deps)
})
.collect()
.collect();
for (pkg_id, dep_pkgs) in no_lib_pkgs {
for (_dep_pkg_without_lib_target, deps) in dep_pkgs {
for dep in deps.iter().filter(|dep| {
dep.artifact()
.map(|artifact| artifact.is_lib())
.unwrap_or(true)
}) {
ws.config().shell().warn(&format!(
"{} ignoring invalid dependency `{}` which is missing a lib target",
pkg_id,
dep.name_in_toml(),
))?;
}
}
}
Ok(())
}
fn filter_deps<'a>(
@ -598,7 +616,7 @@ impl<'cfg> PackageSet<'cfg> {
requested_kinds: &'a [CompileKind],
target_data: &'a RustcTargetData<'_>,
force_all_targets: ForceAllTargets,
) -> impl Iterator<Item = PackageId> + 'a {
) -> impl Iterator<Item = (PackageId, &'a HashSet<Dependency>)> + 'a {
resolve
.deps(pkg_id)
.filter(move |&(_id, deps)| {
@ -618,7 +636,6 @@ impl<'cfg> PackageSet<'cfg> {
true
})
})
.map(|(pkg_id, _)| pkg_id)
.into_iter()
}
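The warning pass above, which skips artifact dependencies with `lib = false`, can be sketched standalone. `Package` and `Dependency` here are simplified stand-ins reduced to the fields this pass inspects:

```rust
// Sketch: among dependencies that could be libraries (`maybe_lib`), warn for
// any whose package has no lib target. Artifact deps with `lib = false` are
// skipped, so they produce no spurious warning.
struct Package {
    has_lib_target: bool,
}

struct Dependency {
    name: &'static str,
    /// `None` for an ordinary dependency; `Some(lib)` for an artifact dependency.
    artifact_is_lib: Option<bool>,
}

impl Dependency {
    fn maybe_lib(&self) -> bool {
        self.artifact_is_lib.unwrap_or(true)
    }
}

fn warnings(deps: &[(Package, Dependency)]) -> Vec<String> {
    deps.iter()
        .filter(|(pkg, dep)| dep.maybe_lib() && !pkg.has_lib_target)
        .map(|(_, dep)| {
            format!(
                "ignoring invalid dependency `{}` which is missing a lib target",
                dep.name
            )
        })
        .collect()
}

fn main() {
    let deps = vec![
        // bin-only artifact dependency: no warning even without a lib target.
        (
            Package { has_lib_target: false },
            Dependency { name: "tool", artifact_is_lib: Some(false) },
        ),
        // ordinary dependency on a package without a lib target: warn.
        (
            Package { has_lib_target: false },
            Dependency { name: "plain", artifact_is_lib: None },
        ),
    ];
    let w = warnings(&deps);
    assert_eq!(w.len(), 1);
    assert!(w[0].contains("plain"));
}
```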


@ -1,4 +1,5 @@
use crate::core::compiler::{CompileKind, CompileMode, Unit};
use crate::core::compiler::{CompileKind, CompileMode, CompileTarget, Unit};
use crate::core::dependency::Artifact;
use crate::core::resolver::features::FeaturesFor;
use crate::core::{Feature, PackageId, PackageIdSpec, Resolve, Shell, Target, Workspace};
use crate::util::interning::InternedString;
@ -331,7 +332,7 @@ impl Profiles {
(self.requested_profile, None)
};
let maker = self.get_profile_maker(profile_name).unwrap();
let mut profile = maker.get_profile(Some(pkg_id), is_member, unit_for);
let mut profile = maker.get_profile(Some(pkg_id), is_member, unit_for.is_for_host());
// Dealing with `panic=abort` and `panic=unwind` requires some special
// treatment. Be sure to process all the various options here.
@ -342,7 +343,9 @@ impl Profiles {
if let Some(inherits) = inherits {
// TODO: Fixme, broken with named profiles.
let maker = self.get_profile_maker(inherits).unwrap();
profile.panic = maker.get_profile(Some(pkg_id), is_member, unit_for).panic;
profile.panic = maker
.get_profile(Some(pkg_id), is_member, unit_for.is_for_host())
.panic;
}
}
}
@ -410,7 +413,7 @@ impl Profiles {
};
let maker = self.get_profile_maker(profile_name).unwrap();
maker.get_profile(None, true, UnitFor::new_normal())
maker.get_profile(None, /*is_member*/ true, /*is_for_host*/ false)
}
/// Gets the directory name for a profile, like `debug` or `release`.
@ -498,7 +501,7 @@ impl ProfileMaker {
&self,
pkg_id: Option<PackageId>,
is_member: bool,
unit_for: UnitFor,
is_for_host: bool,
) -> Profile {
let mut profile = self.default.clone();
@ -510,7 +513,7 @@ impl ProfileMaker {
// Next start overriding those settings. First comes build dependencies
// which default to opt-level 0...
if unit_for.is_for_host() {
if is_for_host {
// For-host units are things like procedural macros, build scripts, and
// their dependencies. For these units most projects simply want them
// to compile quickly and the runtime doesn't matter too much since
@ -526,7 +529,7 @@ impl ProfileMaker {
// profiles, such as `[profile.release.build-override]` or
// `[profile.release.package.foo]`
if let Some(toml) = &self.toml {
merge_toml_overrides(pkg_id, is_member, unit_for, &mut profile, toml);
merge_toml_overrides(pkg_id, is_member, is_for_host, &mut profile, toml);
}
profile
}
@ -536,11 +539,11 @@ impl ProfileMaker {
fn merge_toml_overrides(
pkg_id: Option<PackageId>,
is_member: bool,
unit_for: UnitFor,
is_for_host: bool,
profile: &mut Profile,
toml: &TomlProfile,
) {
if unit_for.is_for_host() {
if is_for_host {
if let Some(build_override) = &toml.build_override {
merge_profile(profile, build_override);
}
@ -879,6 +882,9 @@ impl fmt::Display for Strip {
/// Flags used in creating `Unit`s to indicate the purpose for the target, and
/// to ensure the target's dependencies have the correct settings.
///
/// This means these are passed down from the root of the dependency tree to apply
/// to most child dependencies.
#[derive(Copy, Clone, Debug, Eq, PartialEq, Hash, Ord, PartialOrd)]
pub struct UnitFor {
/// A target for `build.rs` or any of its dependencies, or a proc-macro or
@ -932,6 +938,24 @@ pub struct UnitFor {
/// handle test/benches inheriting from dev/release, as well as forcing
/// `for_host` units to always unwind.
panic_setting: PanicSetting,
/// The compile kind of the root unit for which artifact dependencies are built.
/// This is required particularly for the `target = "target"` setting of artifact
/// dependencies, which is meant to inherit the `--target` specified on the command-line.
/// However, that is a multi-value argument and root units are already created to
/// reflect one unit per `--target`. Thus we have to build one artifact with the
/// correct target for each of these trees.
/// Note that this is always set, as we don't initially know whether there are
/// artifacts that make use of it.
root_compile_kind: CompileKind,
/// This is only set for artifact dependencies which have their
/// `<target-triple>|target` set.
/// If so, this information is used as part of the key for resolving their features,
/// allowing for target-dependent feature resolution within the entire dependency tree.
/// Note that this target corresponds to the target used to build the units in that
/// dependency tree, too, but this copy of it is specifically used for feature lookup.
artifact_target_for_features: Option<CompileTarget>,
}
#[derive(Copy, Clone, Debug, Eq, PartialEq, Hash, Ord, PartialOrd)]
@ -952,11 +976,13 @@ enum PanicSetting {
impl UnitFor {
/// A unit for a normal target/dependency (i.e., not custom build,
/// proc macro/plugin, or test/bench).
pub fn new_normal() -> UnitFor {
pub fn new_normal(root_compile_kind: CompileKind) -> UnitFor {
UnitFor {
host: false,
host_features: false,
panic_setting: PanicSetting::ReadProfile,
root_compile_kind,
artifact_target_for_features: None,
}
}
@ -966,18 +992,20 @@ impl UnitFor {
/// dependency or proc-macro (something that requires being built "on the
/// host"). Build scripts for non-host units should use `false` because
/// they want to use the features of the package they are running for.
pub fn new_host(host_features: bool) -> UnitFor {
pub fn new_host(host_features: bool, root_compile_kind: CompileKind) -> UnitFor {
UnitFor {
host: true,
host_features,
// Force build scripts to always use `panic=unwind` for now to
// maximally share dependencies with procedural macros.
panic_setting: PanicSetting::AlwaysUnwind,
root_compile_kind,
artifact_target_for_features: None,
}
}
/// A unit for a compiler plugin or their dependencies.
pub fn new_compiler() -> UnitFor {
pub fn new_compiler(root_compile_kind: CompileKind) -> UnitFor {
UnitFor {
host: false,
// The feature resolver doesn't know which dependencies are
@ -988,6 +1016,8 @@ impl UnitFor {
// not abort the process but instead end with a reasonable error
// message that involves catching the panic in the compiler.
panic_setting: PanicSetting::AlwaysUnwind,
root_compile_kind,
artifact_target_for_features: None,
}
}
@ -997,7 +1027,7 @@ impl UnitFor {
/// whether `panic=abort` is supported for tests. Historical versions of
/// rustc did not support this, but newer versions do with an unstable
/// compiler flag.
pub fn new_test(config: &Config) -> UnitFor {
pub fn new_test(config: &Config, root_compile_kind: CompileKind) -> UnitFor {
UnitFor {
host: false,
host_features: false,
@ -1010,14 +1040,16 @@ impl UnitFor {
} else {
PanicSetting::AlwaysUnwind
},
root_compile_kind,
artifact_target_for_features: None,
}
}
/// This is a special case for unit tests of a proc-macro.
///
/// Proc-macro unit tests are forced to be run on the host.
pub fn new_host_test(config: &Config) -> UnitFor {
let mut unit_for = UnitFor::new_test(config);
pub fn new_host_test(config: &Config, root_compile_kind: CompileKind) -> UnitFor {
let mut unit_for = UnitFor::new_test(config, root_compile_kind);
unit_for.host = true;
unit_for.host_features = true;
unit_for
@ -1029,7 +1061,12 @@ impl UnitFor {
/// transition in a sticky fashion. As the dependency graph is being
/// built, once those flags are set, they stay set for the duration of
/// that portion of tree.
pub fn with_dependency(self, parent: &Unit, dep_target: &Target) -> UnitFor {
pub fn with_dependency(
self,
parent: &Unit,
dep_target: &Target,
root_compile_kind: CompileKind,
) -> UnitFor {
// A build script or proc-macro transitions this to being built for the host.
let dep_for_host = dep_target.for_host();
// This is where feature decoupling of host versus target happens.
@ -1054,9 +1091,29 @@ impl UnitFor {
host: self.host || dep_for_host,
host_features,
panic_setting,
root_compile_kind,
artifact_target_for_features: self.artifact_target_for_features,
}
}
/// Set the artifact compile target for use in features using the given `artifact`.
pub(crate) fn with_artifact_features(mut self, artifact: &Artifact) -> UnitFor {
self.artifact_target_for_features = artifact.target().and_then(|t| t.to_compile_target());
self
}
/// Set the artifact compile target as determined by a resolved compile target. This is used if `target = "target"`.
pub(crate) fn with_artifact_features_from_resolved_compile_kind(
mut self,
kind: Option<CompileKind>,
) -> UnitFor {
self.artifact_target_for_features = kind.and_then(|kind| match kind {
CompileKind::Host => None,
CompileKind::Target(triple) => Some(triple),
});
self
}
/// Returns `true` if this unit is for a build script or any of its
/// dependencies, or a proc macro or any of its dependencies.
pub fn is_for_host(&self) -> bool {
@ -1072,47 +1129,25 @@ impl UnitFor {
self.panic_setting
}
/// All possible values, used by `clean`.
pub fn all_values() -> &'static [UnitFor] {
static ALL: &[UnitFor] = &[
UnitFor {
host: false,
host_features: false,
panic_setting: PanicSetting::ReadProfile,
/// We might already carry a parent's artifact compile kind for features, but will
/// gladly accept this dependency's own as an override, as it defines how
/// the artifact is built.
/// If we are an artifact but don't specify a `target`, we assume the default
/// compile kind that is suitable in this situation.
pub(crate) fn map_to_features_for(&self, dep_artifact: Option<&Artifact>) -> FeaturesFor {
FeaturesFor::from_for_host_or_artifact_target(
self.is_for_host_features(),
match dep_artifact {
Some(artifact) => artifact
.target()
.and_then(|t| t.to_resolved_compile_target(self.root_compile_kind)),
None => self.artifact_target_for_features,
},
UnitFor {
host: true,
host_features: false,
panic_setting: PanicSetting::AlwaysUnwind,
},
UnitFor {
host: false,
host_features: false,
panic_setting: PanicSetting::AlwaysUnwind,
},
UnitFor {
host: false,
host_features: false,
panic_setting: PanicSetting::Inherit,
},
// host_features=true must always have host=true
// `Inherit` is not used in build dependencies.
UnitFor {
host: true,
host_features: true,
panic_setting: PanicSetting::ReadProfile,
},
UnitFor {
host: true,
host_features: true,
panic_setting: PanicSetting::AlwaysUnwind,
},
];
ALL
)
}
pub(crate) fn map_to_features_for(&self) -> FeaturesFor {
FeaturesFor::from_for_host(self.is_for_host_features())
pub(crate) fn root_compile_kind(&self) -> CompileKind {
self.root_compile_kind
}
}


@ -30,8 +30,8 @@
//! within a dependency have been removed. There are probably other
//! assumptions that I am forgetting.
use crate::core::compiler::{CompileKind, RustcTargetData};
use crate::core::dependency::{DepKind, Dependency};
use crate::core::compiler::{CompileKind, CompileTarget, RustcTargetData};
use crate::core::dependency::{ArtifactTarget, DepKind, Dependency};
use crate::core::resolver::types::FeaturesSet;
use crate::core::resolver::{Resolve, ResolveBehavior};
use crate::core::{FeatureValue, PackageId, PackageIdSpec, PackageSet, Workspace};
@ -41,11 +41,14 @@ use anyhow::bail;
use std::collections::{BTreeMap, BTreeSet, HashMap, HashSet};
use std::rc::Rc;
/// The key used in various places to store features for a particular dependency.
/// The actual discrimination happens with the `FeaturesFor` type.
type PackageFeaturesKey = (PackageId, FeaturesFor);
/// Map of activated features.
///
/// The key is a `PackageFeaturesKey`, whose `FeaturesFor` is `HostDep` if these
/// are features for a build dependency or proc-macro.
type ActivateMap = HashMap<(PackageId, bool), BTreeSet<InternedString>>;
type ActivateMap = HashMap<PackageFeaturesKey, BTreeSet<InternedString>>;
/// Set of all activated features for all packages in the resolve graph.
pub struct ResolvedFeatures {
@ -60,7 +63,12 @@ pub struct ResolvedFeatures {
/// Options for how the feature resolver works.
#[derive(Default)]
pub struct FeatureOpts {
/// Build deps and proc-macros will not share features with other dep kinds.
/// Build deps and proc-macros will not share features with other dep kinds,
/// and neither will artifact targets.
/// In other words, if true, features associated with certain kinds of dependencies
/// will only be unified together.
/// If false, there is only one namespace for features, unifying all features across
/// all dependencies, no matter what kind.
decouple_host_deps: bool,
/// Dev dep features will not be activated unless needed.
decouple_dev_deps: bool,
@ -91,19 +99,63 @@ pub enum ForceAllTargets {
}
/// Flag to indicate if features are requested for a build dependency or not.
#[derive(Copy, Clone, Debug, PartialEq)]
#[derive(Copy, Clone, Debug, PartialEq, Eq, Ord, PartialOrd, Hash)]
pub enum FeaturesFor {
NormalOrDev,
/// If `Some(target)` is present, we represent an artifact target.
/// Otherwise any other normal or dev dependency.
NormalOrDevOrArtifactTarget(Option<CompileTarget>),
/// Build dependency or proc-macro.
HostDep,
}
impl Default for FeaturesFor {
fn default() -> Self {
FeaturesFor::NormalOrDevOrArtifactTarget(None)
}
}
impl std::fmt::Display for FeaturesFor {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
FeaturesFor::HostDep => f.write_str("host"),
FeaturesFor::NormalOrDevOrArtifactTarget(Some(target)) => {
f.write_str(&target.rustc_target())
}
FeaturesFor::NormalOrDevOrArtifactTarget(None) => Ok(()),
}
}
}
impl FeaturesFor {
pub fn from_for_host(for_host: bool) -> FeaturesFor {
if for_host {
FeaturesFor::HostDep
} else {
FeaturesFor::NormalOrDev
FeaturesFor::NormalOrDevOrArtifactTarget(None)
}
}
pub fn from_for_host_or_artifact_target(
for_host: bool,
artifact_target: Option<CompileTarget>,
) -> FeaturesFor {
match artifact_target {
Some(target) => FeaturesFor::NormalOrDevOrArtifactTarget(Some(target)),
None => {
if for_host {
FeaturesFor::HostDep
} else {
FeaturesFor::NormalOrDevOrArtifactTarget(None)
}
}
}
}
fn apply_opts(self, opts: &FeatureOpts) -> Self {
if opts.decouple_host_deps {
self
} else {
FeaturesFor::default()
}
}
}
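The effect of `apply_opts` on feature unification can be shown standalone: when host deps are not decoupled, every key collapses onto the default one, so features unify across all dependency kinds. The types below are simplified stand-ins for the diff above (`Normal` stands in for `NormalOrDevOrArtifactTarget(None)`):

```rust
use std::collections::HashMap;

// Sketch: feature-key decoupling. With decoupling on, host features and normal
// features live under distinct keys; with it off, they share one namespace.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
enum FeaturesFor {
    Normal,
    HostDep,
}

struct FeatureOpts {
    decouple_host_deps: bool,
}

impl FeaturesFor {
    fn apply_opts(self, opts: &FeatureOpts) -> Self {
        if opts.decouple_host_deps {
            self
        } else {
            FeaturesFor::Normal
        }
    }
}

fn main() {
    let decoupled = FeatureOpts { decouple_host_deps: true };
    let unified = FeatureOpts { decouple_host_deps: false };

    // With decoupling, host and normal features live under distinct keys.
    let mut map: HashMap<(&str, FeaturesFor), Vec<&str>> = HashMap::new();
    map.entry(("serde", FeaturesFor::HostDep.apply_opts(&decoupled)))
        .or_default()
        .push("derive");
    map.entry(("serde", FeaturesFor::Normal.apply_opts(&decoupled)))
        .or_default()
        .push("std");
    assert_eq!(map.len(), 2);

    // Without decoupling, both collapse onto the same key and unify.
    assert_eq!(
        FeaturesFor::HostDep.apply_opts(&unified),
        FeaturesFor::Normal
    );
}
```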
@ -276,9 +328,9 @@ impl ResolvedFeatures {
features_for: FeaturesFor,
dep_name: InternedString,
) -> bool {
let is_build = self.opts.decouple_host_deps && features_for == FeaturesFor::HostDep;
let key = features_for.apply_opts(&self.opts);
self.activated_dependencies
.get(&(pkg_id, is_build))
.get(&(pkg_id, key))
.map(|deps| deps.contains(&dep_name))
.unwrap_or(false)
}
@ -299,11 +351,11 @@ impl ResolvedFeatures {
pkg_id: PackageId,
features_for: FeaturesFor,
) -> CargoResult<Vec<InternedString>> {
let is_build = self.opts.decouple_host_deps && features_for == FeaturesFor::HostDep;
if let Some(fs) = self.activated_features.get(&(pkg_id, is_build)) {
let fk = features_for.apply_opts(&self.opts);
if let Some(fs) = self.activated_features.get(&(pkg_id, fk)) {
Ok(fs.iter().cloned().collect())
} else {
bail!("features did not find {:?} {:?}", pkg_id, is_build)
bail!("features did not find {:?} {:?}", pkg_id, fk)
}
}
@ -318,7 +370,11 @@ impl ResolvedFeatures {
.activated_features
.get(&(*pkg_id, *for_host))
// The new features may have for_host entries where the old one does not.
.or_else(|| legacy.activated_features.get(&(*pkg_id, false)))
.or_else(|| {
legacy
.activated_features
.get(&(*pkg_id, FeaturesFor::default()))
})
.map(|feats| feats.iter().cloned().collect())
.unwrap_or_else(|| BTreeSet::new());
// The new resolver should never add features.
@ -338,7 +394,7 @@ impl ResolvedFeatures {
/// Map of differences.
///
/// Key is a `PackageFeaturesKey`. Value is a set of features or dependencies removed.
pub type DiffMap = BTreeMap<(PackageId, bool), BTreeSet<InternedString>>;
pub type DiffMap = BTreeMap<PackageFeaturesKey, BTreeSet<InternedString>>;
pub struct FeatureResolver<'a, 'cfg> {
ws: &'a Workspace<'cfg>,
@ -355,8 +411,8 @@ pub struct FeatureResolver<'a, 'cfg> {
activated_dependencies: ActivateMap,
/// Keeps track of which packages have had its dependencies processed.
/// Used to avoid cycles, and to speed up processing.
processed_deps: HashSet<(PackageId, bool)>,
/// If this is `true`, then `for_host` needs to be tracked while
processed_deps: HashSet<PackageFeaturesKey>,
/// If this is `true`, then a non-default `feature_key` needs to be tracked while
/// traversing the graph.
///
/// This is only here to avoid calling `is_proc_macro` when all feature
@ -370,7 +426,8 @@ pub struct FeatureResolver<'a, 'cfg> {
/// The key is the `(package, features_for, dep_name)` of the package whose
/// dependency will trigger the addition of new features. The value is the
/// set of features to activate.
deferred_weak_dependencies: HashMap<(PackageId, bool, InternedString), HashSet<InternedString>>,
deferred_weak_dependencies:
HashMap<(PackageId, FeaturesFor, InternedString), HashSet<InternedString>>,
}
impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
@ -423,18 +480,20 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
let member_features = self.ws.members_with_features(specs, cli_features)?;
for (member, cli_features) in &member_features {
let fvs = self.fvs_from_requested(member.package_id(), cli_features);
let for_host = self.track_for_host && self.is_proc_macro(member.package_id());
self.activate_pkg(member.package_id(), for_host, &fvs)?;
if for_host {
// Also activate without for_host. This is needed if the
let fk = if self.track_for_host && self.is_proc_macro(member.package_id()) {
// Also activate for normal dependencies. This is needed if the
// proc-macro includes other targets (like binaries or tests),
// or running in `cargo test`. Note that in a workspace, if
// the proc-macro is selected on the command like (like with
// `--workspace`), this forces feature unification with normal
// dependencies. This is part of the bigger problem where
// features depend on which packages are built.
self.activate_pkg(member.package_id(), false, &fvs)?;
}
self.activate_pkg(member.package_id(), FeaturesFor::default(), &fvs)?;
FeaturesFor::HostDep
} else {
FeaturesFor::default()
};
self.activate_pkg(member.package_id(), fk, &fvs)?;
}
Ok(())
}
@ -442,20 +501,20 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
fn activate_pkg(
&mut self,
pkg_id: PackageId,
for_host: bool,
fk: FeaturesFor,
fvs: &[FeatureValue],
) -> CargoResult<()> {
log::trace!("activate_pkg {} {}", pkg_id.name(), for_host);
log::trace!("activate_pkg {} {}", pkg_id.name(), fk);
// Add an empty entry to ensure everything is covered. This is intended for
// finding bugs where the resolver missed something it should have visited.
// Remove this in the future if `activated_features` uses an empty default.
self.activated_features
.entry((pkg_id, self.opts.decouple_host_deps && for_host))
.entry((pkg_id, fk.apply_opts(&self.opts)))
.or_insert_with(BTreeSet::new);
for fv in fvs {
self.activate_fv(pkg_id, for_host, fv)?;
self.activate_fv(pkg_id, fk, fv)?;
}
if !self.processed_deps.insert((pkg_id, for_host)) {
if !self.processed_deps.insert((pkg_id, fk)) {
// Already processed dependencies. There's no need to process them
// again. This is primarily to avoid cycles, but also helps speed
// things up.
@ -471,8 +530,8 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
// features that enable other features.
return Ok(());
}
for (dep_pkg_id, deps) in self.deps(pkg_id, for_host) {
for (dep, dep_for_host) in deps {
for (dep_pkg_id, deps) in self.deps(pkg_id, fk) {
for (dep, dep_fk) in deps {
if dep.is_optional() {
// Optional dependencies are enabled in `activate_fv` when
// a feature enables it.
@ -480,7 +539,7 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
}
// Recurse into the dependency.
let fvs = self.fvs_from_dependency(dep_pkg_id, dep);
self.activate_pkg(dep_pkg_id, dep_for_host, &fvs)?;
self.activate_pkg(dep_pkg_id, dep_fk, &fvs)?;
}
}
Ok(())
@ -490,23 +549,23 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
fn activate_fv(
&mut self,
pkg_id: PackageId,
for_host: bool,
fk: FeaturesFor,
fv: &FeatureValue,
) -> CargoResult<()> {
log::trace!("activate_fv {} {} {}", pkg_id.name(), for_host, fv);
log::trace!("activate_fv {} {} {}", pkg_id.name(), fk, fv);
match fv {
FeatureValue::Feature(f) => {
self.activate_rec(pkg_id, for_host, *f)?;
self.activate_rec(pkg_id, fk, *f)?;
}
FeatureValue::Dep { dep_name } => {
self.activate_dependency(pkg_id, for_host, *dep_name)?;
self.activate_dependency(pkg_id, fk, *dep_name)?;
}
FeatureValue::DepFeature {
dep_name,
dep_feature,
weak,
} => {
self.activate_dep_feature(pkg_id, for_host, *dep_name, *dep_feature, *weak)?;
self.activate_dep_feature(pkg_id, fk, *dep_name, *dep_feature, *weak)?;
}
}
Ok(())
@ -517,18 +576,18 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
fn activate_rec(
&mut self,
pkg_id: PackageId,
for_host: bool,
fk: FeaturesFor,
feature_to_enable: InternedString,
) -> CargoResult<()> {
log::trace!(
"activate_rec {} {} feat={}",
pkg_id.name(),
for_host,
fk,
feature_to_enable
);
let enabled = self
.activated_features
.entry((pkg_id, self.opts.decouple_host_deps && for_host))
.entry((pkg_id, fk.apply_opts(&self.opts)))
.or_insert_with(BTreeSet::new);
if !enabled.insert(feature_to_enable) {
// Already enabled.
@ -551,7 +610,7 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
}
};
for fv in fvs {
self.activate_fv(pkg_id, for_host, fv)?;
self.activate_fv(pkg_id, fk, fv)?;
}
Ok(())
}
@ -560,22 +619,22 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
fn activate_dependency(
&mut self,
pkg_id: PackageId,
for_host: bool,
fk: FeaturesFor,
dep_name: InternedString,
) -> CargoResult<()> {
// Mark this dependency as activated.
let save_for_host = self.opts.decouple_host_deps && for_host;
let save_decoupled = fk.apply_opts(&self.opts);
self.activated_dependencies
.entry((pkg_id, save_for_host))
.entry((pkg_id, save_decoupled))
.or_default()
.insert(dep_name);
// Check for any deferred features.
let to_enable = self
.deferred_weak_dependencies
.remove(&(pkg_id, for_host, dep_name));
.remove(&(pkg_id, fk, dep_name));
// Activate the optional dep.
for (dep_pkg_id, deps) in self.deps(pkg_id, for_host) {
for (dep, dep_for_host) in deps {
for (dep_pkg_id, deps) in self.deps(pkg_id, fk) {
for (dep, dep_fk) in deps {
if dep.name_in_toml() != dep_name {
continue;
}
@ -584,16 +643,16 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
log::trace!(
"activate deferred {} {} -> {}/{}",
pkg_id.name(),
for_host,
fk,
dep_name,
dep_feature
);
let fv = FeatureValue::new(*dep_feature);
self.activate_fv(dep_pkg_id, dep_for_host, &fv)?;
self.activate_fv(dep_pkg_id, dep_fk, &fv)?;
}
}
let fvs = self.fvs_from_dependency(dep_pkg_id, dep);
self.activate_pkg(dep_pkg_id, dep_for_host, &fvs)?;
self.activate_pkg(dep_pkg_id, dep_fk, &fvs)?;
}
}
Ok(())
@ -603,18 +662,18 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
fn activate_dep_feature(
&mut self,
pkg_id: PackageId,
for_host: bool,
fk: FeaturesFor,
dep_name: InternedString,
dep_feature: InternedString,
weak: bool,
) -> CargoResult<()> {
for (dep_pkg_id, deps) in self.deps(pkg_id, for_host) {
for (dep, dep_for_host) in deps {
for (dep_pkg_id, deps) in self.deps(pkg_id, fk) {
for (dep, dep_fk) in deps {
if dep.name_in_toml() != dep_name {
continue;
}
if dep.is_optional() {
let save_for_host = self.opts.decouple_host_deps && for_host;
let save_for_host = fk.apply_opts(&self.opts);
if weak
&& !self
.activated_dependencies
@ -627,12 +686,12 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
log::trace!(
"deferring feature {} {} -> {}/{}",
pkg_id.name(),
for_host,
fk,
dep_name,
dep_feature
);
self.deferred_weak_dependencies
.entry((pkg_id, for_host, dep_name))
.entry((pkg_id, fk, dep_name))
.or_default()
.insert(dep_feature);
continue;
@ -640,17 +699,17 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
// Activate the dependency on self.
let fv = FeatureValue::Dep { dep_name };
self.activate_fv(pkg_id, for_host, &fv)?;
self.activate_fv(pkg_id, fk, &fv)?;
if !weak {
// The old behavior before weak dependencies were
// added is to also enable a feature of the same
// name.
self.activate_rec(pkg_id, for_host, dep_name)?;
self.activate_rec(pkg_id, fk, dep_name)?;
}
}
// Activate the feature on the dependency.
let fv = FeatureValue::new(dep_feature);
self.activate_fv(dep_pkg_id, dep_for_host, &fv)?;
self.activate_fv(dep_pkg_id, dep_fk, &fv)?;
}
}
Ok(())
@ -698,14 +757,14 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
fn deps(
&self,
pkg_id: PackageId,
for_host: bool,
) -> Vec<(PackageId, Vec<(&'a Dependency, bool)>)> {
fk: FeaturesFor,
) -> Vec<(PackageId, Vec<(&'a Dependency, FeaturesFor)>)> {
// Helper for determining if a platform is activated.
let platform_activated = |dep: &Dependency| -> bool {
// We always care about build-dependencies, and they are always
// Host. If we are computing dependencies "for a build script",
// even normal dependencies are host-only.
if for_host || dep.is_build() {
if fk == FeaturesFor::HostDep || dep.is_build() {
return self
.target_data
.dep_platform_activated(dep, CompileKind::Host);
@ -732,10 +791,80 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
}
true
})
.map(|dep| {
let dep_for_host = self.track_for_host
&& (for_host || dep.is_build() || self.is_proc_macro(dep_id));
(dep, dep_for_host)
.flat_map(|dep| {
// Each `dep`endency can be built for multiple targets. For one, it
// may be a library target which is built as initially configured
// by `fk`. If it appears as a build dependency, it must be built
// for the host.
//
// It may also be an artifact dependency,
// which could be built either
//
// - for a specific (aka 'forced') target, specified by
// `dep = { …, target = "<triple>" }`
// - as an artifact for use in build dependencies that should
// build for whichever `--target`s are specified
// - like a library would be built
//
// Generally, the logic for choosing a target for dependencies is
// unaltered and used to determine how to build non-artifacts,
// artifacts without a target specification and without a library,
// or an artifact's library.
//
// All this may result in a dependency being built multiple times
// for various targets which are either specified in the manifest
// or on the cargo command-line.
let lib_fk = if fk == FeaturesFor::default() {
(self.track_for_host && (dep.is_build() || self.is_proc_macro(dep_id)))
.then(|| FeaturesFor::HostDep)
.unwrap_or_default()
} else {
fk
};
// `artifact_target_keys` are produced to fulfill the needs of artifacts
// that have a target specification.
let artifact_target_keys = dep.artifact().map(|artifact| {
(
artifact.is_lib(),
artifact.target().map(|target| match target {
ArtifactTarget::Force(target) => {
vec![FeaturesFor::NormalOrDevOrArtifactTarget(Some(target))]
}
ArtifactTarget::BuildDependencyAssumeTarget => self
.requested_targets
.iter()
.filter_map(|kind| match kind {
CompileKind::Host => None,
CompileKind::Target(target) => {
Some(FeaturesFor::NormalOrDevOrArtifactTarget(
Some(*target),
))
}
})
.collect(),
}),
)
});
let dep_fks = match artifact_target_keys {
// The artifact is also a library and does specify custom
// targets.
// The library's feature key needs to be used alongside
// the artifact target keys.
Some((is_lib, Some(mut dep_fks))) if is_lib => {
dep_fks.push(lib_fk);
dep_fks
}
// The artifact is not a library, but does specify
// custom targets.
// Use only these targets' feature keys.
Some((_, Some(dep_fks))) => dep_fks,
// There is no artifact in the current dependency
// or there is no target specified on the artifact.
// Use the standard feature key without any alteration.
Some((_, None)) | None => vec![lib_fk],
};
dep_fks.into_iter().map(move |dep_fk| (dep, dep_fk))
})
.collect::<Vec<_>>();
(dep_id, deps)


@ -295,12 +295,12 @@ unable to verify that `{0}` is the same as when the lockfile was generated
&self.metadata
}
pub fn extern_crate_name(
pub fn extern_crate_name_and_dep_name(
&self,
from: PackageId,
to: PackageId,
to_target: &Target,
) -> CargoResult<String> {
) -> CargoResult<(InternedString, Option<InternedString>)> {
let empty_set: HashSet<Dependency> = HashSet::new();
let deps = if from == to {
&empty_set
@ -308,22 +308,22 @@ unable to verify that `{0}` is the same as when the lockfile was generated
self.dependencies_listed(from, to)
};
let crate_name = to_target.crate_name();
let mut names = deps.iter().map(|d| {
let target_crate_name = || (to_target.crate_name(), None);
let mut name_pairs = deps.iter().map(|d| {
d.explicit_name_in_toml()
.map(|s| s.as_str().replace("-", "_"))
.unwrap_or_else(|| crate_name.clone())
.map(|s| (s.as_str().replace("-", "_"), Some(s)))
.unwrap_or_else(target_crate_name)
});
let name = names.next().unwrap_or_else(|| crate_name.clone());
for n in names {
let (extern_crate_name, dep_name) = name_pairs.next().unwrap_or_else(target_crate_name);
for (n, _) in name_pairs {
anyhow::ensure!(
n == name,
n == extern_crate_name,
"the crate `{}` depends on crate `{}` multiple times with different names",
from,
to,
);
}
Ok(name)
Ok((extern_crate_name.into(), dep_name))
}
fn dependencies_listed(&self, from: PackageId, to: PackageId) -> &HashSet<Dependency> {


@ -26,7 +26,7 @@ use std::collections::{BTreeSet, HashMap, HashSet};
use std::hash::{Hash, Hasher};
use std::sync::Arc;
use crate::core::compiler::unit_dependencies::build_unit_dependencies;
use crate::core::compiler::unit_dependencies::{build_unit_dependencies, IsArtifact};
use crate::core::compiler::unit_graph::{self, UnitDep, UnitGraph};
use crate::core::compiler::{standard_lib, TargetInfo};
use crate::core::compiler::{BuildConfig, BuildContext, Compilation, Context};
@ -996,9 +996,67 @@ fn generate_targets(
) -> CargoResult<Vec<Unit>> {
let config = ws.config();
// Helper for creating a list of `Unit` structures
let new_unit =
|units: &mut HashSet<Unit>, pkg: &Package, target: &Target, target_mode: CompileMode| {
let unit_for = if target_mode.is_any_test() {
let new_unit = |units: &mut HashSet<Unit>,
pkg: &Package,
target: &Target,
initial_target_mode: CompileMode| {
// Custom build units are added in `build_unit_dependencies`.
assert!(!target.is_custom_build());
let target_mode = match initial_target_mode {
CompileMode::Test => {
if target.is_example() && !filter.is_specific() && !target.tested() {
// Examples are included as regular binaries to verify
// that they compile.
CompileMode::Build
} else {
CompileMode::Test
}
}
CompileMode::Build => match *target.kind() {
TargetKind::Test => CompileMode::Test,
TargetKind::Bench => CompileMode::Bench,
_ => CompileMode::Build,
},
// `CompileMode::Bench` is only used to inform `filter_default_targets`
// which command is being used (`cargo bench`). Afterwards, tests
// and benches are treated identically. Switching the mode allows
// de-duplication of units that are essentially identical. For
// example, `cargo build --all-targets --release` creates the units
// (lib profile:bench, mode:test) and (lib profile:bench, mode:bench)
// and since these are the same, we want them to be de-duplicated in
// `unit_dependencies`.
CompileMode::Bench => CompileMode::Test,
_ => initial_target_mode,
};
let is_local = pkg.package_id().source_id().is_path();
// No need to worry about build-dependencies, roots are never build dependencies.
let features_for = FeaturesFor::from_for_host(target.proc_macro());
let features = resolved_features.activated_features(pkg.package_id(), features_for);
// If `--target` has not been specified, then the unit
// graph is built almost like if `--target $HOST` was
// specified. See `rebuild_unit_graph_shared` for more on
// why this is done. However, if the package has its own
// `package.target` key, then this gets used instead of
// `$HOST`
let explicit_kinds = if let Some(k) = pkg.manifest().forced_kind() {
vec![k]
} else {
requested_kinds
.iter()
.map(|kind| match kind {
CompileKind::Host => {
pkg.manifest().default_kind().unwrap_or(explicit_host_kind)
}
CompileKind::Target(t) => CompileKind::Target(*t),
})
.collect()
};
for kind in explicit_kinds.iter() {
let unit_for = if initial_target_mode.is_any_test() {
// NOTE: the `UnitFor` here is subtle. If you have a profile
// with `panic` set, the `panic` flag is cleared for
// tests/benchmarks and their dependencies. If this
@ -1017,90 +1075,35 @@ fn generate_targets(
//
// Forcing the lib to be compiled three times during `cargo
// test` is probably also not desirable.
UnitFor::new_test(config)
UnitFor::new_test(config, *kind)
} else if target.for_host() {
// Proc macro / plugin should not have `panic` set.
UnitFor::new_compiler()
UnitFor::new_compiler(*kind)
} else {
UnitFor::new_normal()
UnitFor::new_normal(*kind)
};
// Custom build units are added in `build_unit_dependencies`.
assert!(!target.is_custom_build());
let target_mode = match target_mode {
CompileMode::Test => {
if target.is_example() && !filter.is_specific() && !target.tested() {
// Examples are included as regular binaries to verify
// that they compile.
CompileMode::Build
} else {
CompileMode::Test
}
}
CompileMode::Build => match *target.kind() {
TargetKind::Test => CompileMode::Test,
TargetKind::Bench => CompileMode::Bench,
_ => CompileMode::Build,
},
// `CompileMode::Bench` is only used to inform `filter_default_targets`
// which command is being used (`cargo bench`). Afterwards, tests
// and benches are treated identically. Switching the mode allows
// de-duplication of units that are essentially identical. For
// example, `cargo build --all-targets --release` creates the units
// (lib profile:bench, mode:test) and (lib profile:bench, mode:bench)
// and since these are the same, we want them to be de-duplicated in
// `unit_dependencies`.
CompileMode::Bench => CompileMode::Test,
_ => target_mode,
};
let is_local = pkg.package_id().source_id().is_path();
// No need to worry about build-dependencies, roots are never build dependencies.
let features_for = FeaturesFor::from_for_host(target.proc_macro());
let features = resolved_features.activated_features(pkg.package_id(), features_for);
// If `--target` has not been specified, then the unit
// graph is built almost like if `--target $HOST` was
// specified. See `rebuild_unit_graph_shared` for more on
// why this is done. However, if the package has its own
// `package.target` key, then this gets used instead of
// `$HOST`
let explicit_kinds = if let Some(k) = pkg.manifest().forced_kind() {
vec![k]
} else {
requested_kinds
.iter()
.map(|kind| match kind {
CompileKind::Host => {
pkg.manifest().default_kind().unwrap_or(explicit_host_kind)
}
CompileKind::Target(t) => CompileKind::Target(*t),
})
.collect()
};
for kind in explicit_kinds.iter() {
let profile = profiles.get_profile(
pkg.package_id(),
ws.is_member(pkg),
is_local,
unit_for,
target_mode,
*kind,
);
let unit = interner.intern(
pkg,
target,
profile,
kind.for_target(target),
target_mode,
features.clone(),
/*is_std*/ false,
/*dep_hash*/ 0,
);
units.insert(unit);
}
};
let profile = profiles.get_profile(
pkg.package_id(),
ws.is_member(pkg),
is_local,
unit_for,
target_mode,
*kind,
);
let unit = interner.intern(
pkg,
target,
profile,
kind.for_target(target),
target_mode,
features.clone(),
/*is_std*/ false,
/*dep_hash*/ 0,
IsArtifact::No,
);
units.insert(unit);
}
};
// Create a list of proposed targets.
let mut proposals: Vec<Proposal<'_>> = Vec::new();
@ -1429,7 +1432,7 @@ pub fn resolve_all_features(
package_id: PackageId,
) -> HashSet<String> {
let mut features: HashSet<String> = resolved_features
.activated_features(package_id, FeaturesFor::NormalOrDev)
.activated_features(package_id, FeaturesFor::NormalOrDevOrArtifactTarget(None))
.iter()
.map(|s| s.to_string())
.collect();
@ -1658,6 +1661,7 @@ fn traverse_and_share(
unit.features.clone(),
unit.is_std,
new_dep_hash,
unit.artifact,
);
assert!(memo.insert(unit.clone(), new_unit.clone()).is_none());
new_graph.entry(new_unit.clone()).or_insert(new_deps);


@ -81,7 +81,7 @@ struct MetadataResolveNode {
#[derive(Serialize)]
struct Dep {
name: String,
name: InternedString,
pkg: PackageId,
dep_kinds: Vec<DepKindInfo>,
}
@ -211,7 +211,12 @@ fn build_resolve_graph_r(
package_map
.get(&dep_id)
.and_then(|pkg| pkg.targets().iter().find(|t| t.is_lib()))
.and_then(|lib_target| resolve.extern_crate_name(pkg_id, dep_id, lib_target).ok())
.and_then(|lib_target| {
resolve
.extern_crate_name_and_dep_name(pkg_id, dep_id, lib_target)
.map(|(ext_crate_name, _)| ext_crate_name)
.ok()
})
.map(|name| Dep {
name,
pkg: normalize_id(dep_id),


@ -165,6 +165,7 @@ fn run_doc_tests(
unit,
linker,
script_meta,
env,
} = doctest_info;
if !doctest_xcompile {
@ -191,6 +192,10 @@ fn run_doc_tests(
config.shell().status("Doc-tests", unit.target.name())?;
let mut p = compilation.rustdoc_process(unit, *script_meta)?;
for (var, value) in env {
p.env(var, value);
}
p.arg("--crate-name").arg(&unit.target.crate_name());
p.arg("--test");


@ -53,7 +53,7 @@ use rustfix::{self, CodeFix};
use semver::Version;
use crate::core::compiler::RustcTargetData;
use crate::core::resolver::features::{DiffMap, FeatureOpts, FeatureResolver};
use crate::core::resolver::features::{DiffMap, FeatureOpts, FeatureResolver, FeaturesFor};
use crate::core::resolver::{HasDevUnits, Resolve, ResolveBehavior};
use crate::core::{Edition, MaybePackage, PackageId, Workspace};
use crate::ops::resolve::WorkspaceResolve;
@ -284,9 +284,9 @@ fn check_resolver_change(ws: &Workspace<'_>, opts: &FixOptions) -> CargoResult<(
the given features will no longer be used:\n"
);
let show_diffs = |differences: DiffMap| {
for ((pkg_id, for_host), removed) in differences {
for ((pkg_id, features_for), removed) in differences {
drop_eprint!(config, " {}", pkg_id);
if for_host {
if let FeaturesFor::HostDep = features_for {
drop_eprint!(config, " (as host dependency)");
}
drop_eprint!(config, " removed features: ");


@ -171,23 +171,15 @@ pub fn resolve_ws_with_opts<'cfg>(
feature_opts,
)?;
let no_lib_pkgs = pkg_set.no_lib_pkgs(
pkg_set.warn_no_lib_packages_and_artifact_libs_overlapping_deps(
ws,
&resolved_with_overrides,
&member_ids,
has_dev_units,
requested_targets,
target_data,
force_all_targets,
);
for (pkg_id, dep_pkgs) in no_lib_pkgs {
for dep_pkg in dep_pkgs {
ws.config().shell().warn(&format!(
"{} ignoring invalid dependency `{}` which is missing a lib target",
pkg_id,
dep_pkg.name(),
))?;
}
}
)?;
Ok(WorkspaceResolve {
pkg_set,


@ -301,7 +301,8 @@ fn add_pkg(
let node_features = resolved_features.activated_features(package_id, features_for);
let node_kind = match features_for {
FeaturesFor::HostDep => CompileKind::Host,
FeaturesFor::NormalOrDev => requested_kind,
FeaturesFor::NormalOrDevOrArtifactTarget(Some(target)) => CompileKind::Target(target),
FeaturesFor::NormalOrDevOrArtifactTarget(None) => requested_kind,
};
let node = Node::Package {
package_id,


@ -150,16 +150,12 @@ impl<N: Hash + Eq + Clone, E: Eq + Hash + Clone, V> DependencyQueue<N, E, V> {
/// A package is ready to be built when it has 0 un-built dependencies. If
/// `None` is returned then no packages are ready to be built.
pub fn dequeue(&mut self) -> Option<(N, V)> {
let next = self
let key = self
.dep_map
.iter()
.filter(|(_, (deps, _))| deps.is_empty())
.map(|(key, _)| key.clone())
.max_by_key(|k| self.priority[k]);
let key = match next {
Some(key) => key,
None => return None,
};
.max_by_key(|k| self.priority[k])?;
let (_, data) = self.dep_map.remove(&key).unwrap();
Some((key, data))
}


@ -17,7 +17,7 @@ use toml_edit::easy as toml;
use url::Url;
use crate::core::compiler::{CompileKind, CompileTarget};
use crate::core::dependency::DepKind;
use crate::core::dependency::{Artifact, ArtifactTarget, DepKind};
use crate::core::manifest::{ManifestMetadata, TargetSourcePath, Warnings};
use crate::core::resolver::ResolveBehavior;
use crate::core::{Dependency, Manifest, PackageId, Summary, Target};
@ -277,6 +277,13 @@ pub struct DetailedTomlDependency<P = String> {
default_features2: Option<bool>,
package: Option<String>,
public: Option<bool>,
/// One or more of 'bin', 'cdylib', 'staticlib', 'bin:<name>'.
artifact: Option<StringOrVec>,
/// If set, the artifact should also be available as a library dependency.
lib: Option<bool>,
/// A platform name, like `x86_64-apple-darwin`.
target: Option<String>,
}
// Explicit implementation so we avoid pulling in P: Default
@ -297,6 +304,9 @@ impl<P> Default for DetailedTomlDependency<P> {
default_features2: Default::default(),
package: Default::default(),
public: Default::default(),
artifact: Default::default(),
lib: Default::default(),
target: Default::default(),
}
}
}
@ -1950,6 +1960,41 @@ impl<P: ResolveToPath> DetailedTomlDependency<P> {
dep.set_public(p);
}
if let (Some(artifact), is_lib, target) = (
self.artifact.as_ref(),
self.lib.unwrap_or(false),
self.target.as_deref(),
) {
if cx.config.cli_unstable().bindeps {
let artifact = Artifact::parse(artifact, is_lib, target)?;
if dep.kind() != DepKind::Build
&& artifact.target() == Some(ArtifactTarget::BuildDependencyAssumeTarget)
{
bail!(
r#"`target = "target"` in normal- or dev-dependencies has no effect ({})"#,
name_in_toml
);
}
dep.set_artifact(artifact)
} else {
bail!("`artifact = …` requires `-Z bindeps` ({})", name_in_toml);
}
} else if self.lib.is_some() || self.target.is_some() {
for (is_set, specifier) in [
(self.lib.is_some(), "lib"),
(self.target.is_some(), "target"),
] {
if !is_set {
continue;
}
bail!(
"'{}' specifier cannot be used without an 'artifact = …' value ({})",
specifier,
name_in_toml
)
}
}
Ok(dep)
}
}
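The checks above mean that `target = "target"` is only meaningful on build-dependencies, where it requests the artifact to be built for whichever `--target`s are in effect. A sketch of manifest snippets this code accepts and rejects (package name and version are illustrative):

```toml
# Rejected with `-Z bindeps`: `target = "target"` in normal- or
# dev-dependencies has no effect, so Cargo bails out.
[dependencies]
bar = { version = "1.0", artifact = "bin", target = "target" }

# Accepted: as a build dependency, the artifact is built for the
# requested `--target`s instead of the host.
[build-dependencies]
bar = { version = "1.0", artifact = "bin", target = "target" }
```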


@ -88,6 +88,7 @@ Each new feature described below should explain how to use it.
* [Profile `strip` option](#profile-strip-option) — Forces the removal of debug information and symbols from executables.
* [Profile `rustflags` option](#profile-rustflags-option) — Passed directly to rustc.
* [per-package-target](#per-package-target) — Sets the `--target` to use for each individual package.
* [artifact dependencies](#artifact-dependencies) — Allow build artifacts to be included in other build artifacts and built for different targets.
* Information and metadata
* [Build-plan](#build-plan) — Emits JSON information on which commands will be run.
* [unit-graph](#unit-graph) — Emits JSON for Cargo's internal graph structure.
@ -812,6 +813,54 @@ In this example, the crate is always built for
as a plugin for a main program that runs on the host (or provided on
the command line) target.
### artifact-dependencies
* Tracking Issue: [#9096](https://github.com/rust-lang/cargo/issues/9096)
* Original Pull Request: [#9992](https://github.com/rust-lang/cargo/pull/9992)
Allow Cargo packages to depend on `bin`, `cdylib`, and `staticlib` crates,
and use the artifacts built by those crates at compile time.
Run `cargo` with `-Z bindeps` to enable this functionality.
**Example:** use _cdylib_ artifact in build script
The `Cargo.toml` in the consuming package, building the `bar` library as `cdylib`
for a specific build target…
```toml
[build-dependencies]
bar = { artifact = "cdylib", version = "1.0", target = "wasm32-unknown-unknown" }
```
…along with the build script in `build.rs`.
```rust
fn main() {
wasm::run_file(env!("CARGO_CDYLIB_FILE_BAR"));
}
```
**Example:** use _binary_ artifact and its library in a binary
The `Cargo.toml` in the consuming package, building the `bar` binary for inclusion
as an artifact while also making it available as a library…
```toml
[dependencies]
bar = { artifact = "bin", version = "1.0", lib = true }
```
…along with the executable using `main.rs`.
```rust
fn main() {
bar::init();
command::run(env!("CARGO_BIN_FILE_BAR"));
}
```
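Several artifact kinds can also be combined in one dependency, and `bin:<name>` selects a single binary target. A sketch (version, names, and target triple are illustrative):

```toml
[dependencies]
bar = { version = "1.0", artifact = ["cdylib", "staticlib", "bin:baz-name"], lib = true, target = "wasm32-unknown-unknown" }
```

Here only the `baz-name` binary of `bar` is built as a binary artifact, `lib = true` additionally makes `bar` usable as a regular library, and `target` forces the artifacts to be built for the given triple.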
### credential-process
* Tracking Issue: [#8933](https://github.com/rust-lang/cargo/issues/8933)
* RFC: [#2730](https://github.com/rust-lang/rfcs/pull/2730)

File diff suppressed because it is too large


@ -10,6 +10,7 @@ extern crate cargo_test_macro;
mod advanced_env;
mod alt_registry;
mod artifact_dep;
mod bad_config;
mod bad_manifest_path;
mod bench;


@ -925,8 +925,9 @@ fn workspace_metadata() {
}
#[cargo_test]
fn workspace_metadata_no_deps() {
fn workspace_metadata_with_dependencies_no_deps() {
let p = project()
// NOTE that 'artifact' isn't mentioned in the workspace here, yet it shows up as a member.
.file(
"Cargo.toml",
r#"
@ -934,13 +935,29 @@ fn workspace_metadata_no_deps() {
members = ["bar", "baz"]
"#,
)
.file("bar/Cargo.toml", &basic_lib_manifest("bar"))
.file(
"bar/Cargo.toml",
r#"
[package]
name = "bar"
version = "0.5.0"
authors = ["wycats@example.com"]
[dependencies]
baz = { path = "../baz/" }
artifact = { path = "../artifact/", artifact = "bin" }
"#,
)
.file("bar/src/lib.rs", "")
.file("baz/Cargo.toml", &basic_lib_manifest("baz"))
.file("baz/src/lib.rs", "")
.file("artifact/Cargo.toml", &basic_bin_manifest("artifact"))
.file("artifact/src/main.rs", "fn main() {}")
.build();
p.cargo("metadata --no-deps")
p.cargo("metadata --no-deps -Z bindeps")
.masquerade_as_nightly_cargo()
.with_json(
r#"
{
@ -961,8 +978,42 @@ fn workspace_metadata_no_deps() {
"id": "bar[..]",
"keywords": [],
"source": null,
"dependencies": [],
"license": null,
"dependencies": [
{
"features": [],
"kind": null,
"name": "artifact",
"optional": false,
"path": "[..]/foo/artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true,
"artifact": {
"kinds": [
"bin"
],
"lib": false,
"target": null
}
},
{
"features": [],
"kind": null,
"name": "baz",
"optional": false,
"path": "[..]/foo/baz",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
}
],
"license_file": null,
"links": null,
"description": null,
@ -984,6 +1035,49 @@ fn workspace_metadata_no_deps() {
"metadata": null,
"publish": null
},
{
"authors": [
"wycats@example.com"
],
"categories": [],
"default_run": null,
"dependencies": [],
"description": null,
"documentation": null,
"edition": "2015",
"features": {},
"homepage": null,
"id": "artifact 0.5.0 (path+file:[..]/foo/artifact)",
"keywords": [],
"license": null,
"license_file": null,
"links": null,
"manifest_path": "[..]/foo/artifact/Cargo.toml",
"metadata": null,
"name": "artifact",
"publish": null,
"readme": null,
"repository": null,
"rust_version": null,
"source": null,
"targets": [
{
"crate_types": [
"bin"
],
"doc": true,
"doctest": false,
"edition": "2015",
"kind": [
"bin"
],
"name": "artifact",
"src_path": "[..]/foo/artifact/src/main.rs",
"test": true
}
],
"version": "0.5.0"
},
{
"authors": [
"wycats@example.com"
@ -1024,7 +1118,11 @@ fn workspace_metadata_no_deps() {
"publish": null
}
],
"workspace_members": ["bar 0.5.0 (path+file:[..]bar)", "baz 0.5.0 (path+file:[..]baz)"],
"workspace_members": [
"bar 0.5.0 (path+file:[..]bar)",
"artifact 0.5.0 (path+file:[..]/foo/artifact)",
"baz 0.5.0 (path+file:[..]baz)"
],
"resolve": null,
"target_directory": "[..]foo/target",
"version": 1,
@ -1035,6 +1133,563 @@ fn workspace_metadata_no_deps() {
.run();
}
#[cargo_test]
fn workspace_metadata_with_dependencies_and_resolve() {
let alt_target = "wasm32-unknown-unknown";
let p = project()
.file(
"Cargo.toml",
r#"
[workspace]
members = ["bar", "artifact", "non-artifact", "bin-only-artifact"]
"#,
)
.file(
"bar/Cargo.toml",
&r#"
[package]
name = "bar"
version = "0.5.0"
authors = []
[build-dependencies]
artifact = { path = "../artifact/", artifact = "bin", target = "target" }
bin-only-artifact = { path = "../bin-only-artifact/", artifact = "bin", target = "$ALT_TARGET" }
non-artifact = { path = "../non-artifact" }
[dependencies]
artifact = { path = "../artifact/", artifact = ["cdylib", "staticlib", "bin:baz-name"], lib = true, target = "$ALT_TARGET" }
bin-only-artifact = { path = "../bin-only-artifact/", artifact = "bin:a-name" }
non-artifact = { path = "../non-artifact" }
[dev-dependencies]
artifact = { path = "../artifact/" }
non-artifact = { path = "../non-artifact" }
bin-only-artifact = { path = "../bin-only-artifact/", artifact = "bin:b-name" }
"#.replace("$ALT_TARGET", alt_target),
)
.file("bar/src/lib.rs", "")
.file("bar/build.rs", "fn main() {}")
.file(
"artifact/Cargo.toml",
r#"
[package]
name = "artifact"
version = "0.5.0"
authors = []
[lib]
crate-type = ["staticlib", "cdylib", "rlib"]
[[bin]]
name = "bar-name"
[[bin]]
name = "baz-name"
"#,
)
.file("artifact/src/main.rs", "fn main() {}")
.file("artifact/src/lib.rs", "")
.file(
"bin-only-artifact/Cargo.toml",
r#"
[package]
name = "bin-only-artifact"
version = "0.5.0"
authors = []
[[bin]]
name = "a-name"
[[bin]]
name = "b-name"
"#,
)
.file("bin-only-artifact/src/main.rs", "fn main() {}")
.file("non-artifact/Cargo.toml",
r#"
[package]
name = "non-artifact"
version = "0.5.0"
authors = []
"#,
)
.file("non-artifact/src/lib.rs", "")
.build();
p.cargo("metadata -Z bindeps")
.masquerade_as_nightly_cargo()
.with_json(
r#"
{
"metadata": null,
"packages": [
{
"authors": [],
"categories": [],
"default_run": null,
"dependencies": [],
"description": null,
"documentation": null,
"edition": "2015",
"features": {},
"homepage": null,
"id": "artifact 0.5.0 (path+file://[..]/foo/artifact)",
"keywords": [],
"license": null,
"license_file": null,
"links": null,
"manifest_path": "[..]/foo/artifact/Cargo.toml",
"metadata": null,
"name": "artifact",
"publish": null,
"readme": null,
"repository": null,
"rust_version": null,
"source": null,
"targets": [
{
"crate_types": [
"staticlib",
"cdylib",
"rlib"
],
"doc": true,
"doctest": true,
"edition": "2015",
"kind": [
"staticlib",
"cdylib",
"rlib"
],
"name": "artifact",
"src_path": "[..]/foo/artifact/src/lib.rs",
"test": true
},
{
"crate_types": [
"bin"
],
"doc": true,
"doctest": false,
"edition": "2015",
"kind": [
"bin"
],
"name": "bar-name",
"src_path": "[..]/foo/artifact/src/main.rs",
"test": true
},
{
"crate_types": [
"bin"
],
"doc": true,
"doctest": false,
"edition": "2015",
"kind": [
"bin"
],
"name": "baz-name",
"src_path": "[..]/foo/artifact/src/main.rs",
"test": true
}
],
"version": "0.5.0"
},
{
"authors": [],
"categories": [],
"default_run": null,
"dependencies": [
{
"artifact": {
"kinds": [
"cdylib",
"staticlib",
"bin:baz-name"
],
"lib": true,
"target": "wasm32-unknown-unknown"
},
"features": [],
"kind": null,
"name": "artifact",
"optional": false,
"path": "[..]/foo/artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
},
{
"artifact": {
"kinds": [
"bin:a-name"
],
"lib": false,
"target": null
},
"features": [],
"kind": null,
"name": "bin-only-artifact",
"optional": false,
"path": "[..]/foo/bin-only-artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
},
{
"features": [],
"kind": null,
"name": "non-artifact",
"optional": false,
"path": "[..]/foo/non-artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
},
{
"features": [],
"kind": "dev",
"name": "artifact",
"optional": false,
"path": "[..]/foo/artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
},
{
"artifact": {
"kinds": [
"bin:b-name"
],
"lib": false,
"target": null
},
"features": [],
"kind": "dev",
"name": "bin-only-artifact",
"optional": false,
"path": "[..]/foo/bin-only-artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
},
{
"features": [],
"kind": "dev",
"name": "non-artifact",
"optional": false,
"path": "[..]/foo/non-artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
},
{
"artifact": {
"kinds": [
"bin"
],
"lib": false,
"target": "target"
},
"features": [],
"kind": "build",
"name": "artifact",
"optional": false,
"path": "[..]/foo/artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
},
{
"artifact": {
"kinds": [
"bin"
],
"lib": false,
"target": "wasm32-unknown-unknown"
},
"features": [],
"kind": "build",
"name": "bin-only-artifact",
"optional": false,
"path": "[..]/foo/bin-only-artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
},
{
"features": [],
"kind": "build",
"name": "non-artifact",
"optional": false,
"path": "[..]/foo/non-artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
}
],
"description": null,
"documentation": null,
"edition": "2015",
"features": {},
"homepage": null,
"id": "bar 0.5.0 (path+file://[..]/foo/bar)",
"keywords": [],
"license": null,
"license_file": null,
"links": null,
"manifest_path": "[..]/foo/bar/Cargo.toml",
"metadata": null,
"name": "bar",
"publish": null,
"readme": null,
"repository": null,
"rust_version": null,
"source": null,
"targets": [
{
"crate_types": [
"lib"
],
"doc": true,
"doctest": true,
"edition": "2015",
"kind": [
"lib"
],
"name": "bar",
"src_path": "[..]/foo/bar/src/lib.rs",
"test": true
},
{
"crate_types": [
"bin"
],
"doc": false,
"doctest": false,
"edition": "2015",
"kind": [
"custom-build"
],
"name": "build-script-build",
"src_path": "[..]/foo/bar/build.rs",
"test": false
}
],
"version": "0.5.0"
},
{
"authors": [],
"categories": [],
"default_run": null,
"dependencies": [],
"description": null,
"documentation": null,
"edition": "2015",
"features": {},
"homepage": null,
"id": "bin-only-artifact 0.5.0 (path+file://[..]/foo/bin-only-artifact)",
"keywords": [],
"license": null,
"license_file": null,
"links": null,
"manifest_path": "[..]/foo/bin-only-artifact/Cargo.toml",
"metadata": null,
"name": "bin-only-artifact",
"publish": null,
"readme": null,
"repository": null,
"rust_version": null,
"source": null,
"targets": [
{
"crate_types": [
"bin"
],
"doc": true,
"doctest": false,
"edition": "2015",
"kind": [
"bin"
],
"name": "a-name",
"src_path": "[..]/foo/bin-only-artifact/src/main.rs",
"test": true
},
{
"crate_types": [
"bin"
],
"doc": true,
"doctest": false,
"edition": "2015",
"kind": [
"bin"
],
"name": "b-name",
"src_path": "[..]/foo/bin-only-artifact/src/main.rs",
"test": true
}
],
"version": "0.5.0"
},
{
"authors": [],
"categories": [],
"default_run": null,
"dependencies": [],
"description": null,
"documentation": null,
"edition": "2015",
"features": {},
"homepage": null,
"id": "non-artifact 0.5.0 (path+file://[..]/foo/non-artifact)",
"keywords": [],
"license": null,
"license_file": null,
"links": null,
"manifest_path": "[..]/foo/non-artifact/Cargo.toml",
"metadata": null,
"name": "non-artifact",
"publish": null,
"readme": null,
"repository": null,
"rust_version": null,
"source": null,
"targets": [
{
"crate_types": [
"lib"
],
"doc": true,
"doctest": true,
"edition": "2015",
"kind": [
"lib"
],
"name": "non-artifact",
"src_path": "[..]/foo/non-artifact/src/lib.rs",
"test": true
}
],
"version": "0.5.0"
}
],
"resolve": {
"nodes": [
{
"dependencies": [],
"deps": [],
"features": [],
"id": "artifact 0.5.0 (path+file://[..]/foo/artifact)"
},
{
"dependencies": [
"artifact 0.5.0 (path+file://[..]/foo/artifact)",
"non-artifact 0.5.0 (path+file://[..]/foo/non-artifact)"
],
"deps": [
{
"dep_kinds": [
{
"kind": null,
"target": null
},
{
"kind": "dev",
"target": null
},
{
"kind": "build",
"target": null
}
],
"name": "artifact",
"pkg": "artifact 0.5.0 (path+file://[..]/foo/artifact)"
},
{
"dep_kinds": [
{
"kind": null,
"target": null
},
{
"kind": "dev",
"target": null
},
{
"kind": "build",
"target": null
}
],
"name": "non_artifact",
"pkg": "non-artifact 0.5.0 (path+file://[..]/foo/non-artifact)"
}
],
"features": [],
"id": "bar 0.5.0 (path+file://[..]/foo/bar)"
},
{
"dependencies": [],
"deps": [],
"features": [],
"id": "bin-only-artifact 0.5.0 (path+file://[..]/foo/bin-only-artifact)"
},
{
"dependencies": [],
"deps": [],
"features": [],
"id": "non-artifact 0.5.0 (path+file://[..]/foo/non-artifact)"
}
],
"root": null
},
"target_directory": "[..]/foo/target",
"version": 1,
"workspace_members": [
"bar 0.5.0 (path+file://[..]/foo/bar)",
"artifact 0.5.0 (path+file://[..]/foo/artifact)",
"bin-only-artifact 0.5.0 (path+file://[..]/foo/bin-only-artifact)",
"non-artifact 0.5.0 (path+file://[..]/foo/non-artifact)"
],
"workspace_root": "[..]/foo"
}
"#,
)
.run();
}
#[cargo_test]
fn cargo_metadata_with_invalid_manifest() {
let p = project().file("Cargo.toml", "").build();
@ -3095,3 +3750,236 @@ fn cargo_metadata_non_utf8() {
.with_status(101)
.run();
}
// TODO: Consider using this test instead of the version without the 'artifact' suffix, or merge them, as they should be nearly identical.
#[cargo_test]
fn workspace_metadata_with_dependencies_no_deps_artifact() {
let p = project()
// NOTE that 'artifact' isn't mentioned in the workspace here, yet it shows up as member.
.file(
"Cargo.toml",
r#"
[workspace]
members = ["bar", "baz"]
"#,
)
.file(
"bar/Cargo.toml",
r#"
[package]
name = "bar"
version = "0.5.0"
authors = ["wycats@example.com"]
[dependencies]
baz = { path = "../baz/" }
baz-renamed = { path = "../baz/" }
artifact = { path = "../artifact/", artifact = "bin" }
"#,
)
.file("bar/src/lib.rs", "")
.file("baz/Cargo.toml", &basic_lib_manifest("baz"))
.file("baz/src/lib.rs", "")
.file("artifact/Cargo.toml", &basic_bin_manifest("artifact"))
.file("artifact/src/main.rs", "fn main() {}")
.build();
p.cargo("metadata --no-deps -Z bindeps")
.masquerade_as_nightly_cargo()
.with_json(
r#"
{
"metadata": null,
"packages": [
{
"authors": [
"wycats@example.com"
],
"categories": [],
"default_run": null,
"dependencies": [
{
"artifact": {
"kinds": [
"bin"
],
"lib": false,
"target": null
},
"features": [],
"kind": null,
"name": "artifact",
"optional": false,
"path": "[..]/foo/artifact",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
},
{
"features": [],
"kind": null,
"name": "baz",
"optional": false,
"path": "[..]/foo/baz",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
},
{
"features": [],
"kind": null,
"name": "baz-renamed",
"optional": false,
"path": "[..]/foo/baz",
"registry": null,
"rename": null,
"req": "*",
"source": null,
"target": null,
"uses_default_features": true
}
],
"description": null,
"documentation": null,
"edition": "2015",
"features": {},
"homepage": null,
"id": "bar 0.5.0 (path+file://[..]/foo/bar)",
"keywords": [],
"license": null,
"license_file": null,
"links": null,
"manifest_path": "[..]/foo/bar/Cargo.toml",
"metadata": null,
"name": "bar",
"publish": null,
"readme": null,
"repository": null,
"rust_version": null,
"source": null,
"targets": [
{
"crate_types": [
"lib"
],
"doc": true,
"doctest": true,
"edition": "2015",
"kind": [
"lib"
],
"name": "bar",
"src_path": "[..]/foo/bar/src/lib.rs",
"test": true
}
],
"version": "0.5.0"
},
{
"authors": [
"wycats@example.com"
],
"categories": [],
"default_run": null,
"dependencies": [],
"description": null,
"documentation": null,
"edition": "2015",
"features": {},
"homepage": null,
"id": "artifact 0.5.0 (path+file://[..]/foo/artifact)",
"keywords": [],
"license": null,
"license_file": null,
"links": null,
"manifest_path": "[..]/foo/artifact/Cargo.toml",
"metadata": null,
"name": "artifact",
"publish": null,
"readme": null,
"repository": null,
"rust_version": null,
"source": null,
"targets": [
{
"crate_types": [
"bin"
],
"doc": true,
"doctest": false,
"edition": "2015",
"kind": [
"bin"
],
"name": "artifact",
"src_path": "[..]/foo/artifact/src/main.rs",
"test": true
}
],
"version": "0.5.0"
},
{
"authors": [
"wycats@example.com"
],
"categories": [],
"default_run": null,
"dependencies": [],
"description": null,
"documentation": null,
"edition": "2015",
"features": {},
"homepage": null,
"id": "baz 0.5.0 (path+file://[..]/foo/baz)",
"keywords": [],
"license": null,
"license_file": null,
"links": null,
"manifest_path": "[..]/foo/baz/Cargo.toml",
"metadata": null,
"name": "baz",
"publish": null,
"readme": null,
"repository": null,
"rust_version": null,
"source": null,
"targets": [
{
"crate_types": [
"lib"
],
"doc": true,
"doctest": true,
"edition": "2015",
"kind": [
"lib"
],
"name": "baz",
"src_path": "[..]/foo/baz/src/lib.rs",
"test": true
}
],
"version": "0.5.0"
}
],
"resolve": null,
"target_directory": "[..]/foo/target",
"version": 1,
"workspace_members": [
"bar 0.5.0 (path+file://[..]/foo/bar)",
"artifact 0.5.0 (path+file://[..]/foo/artifact)",
"baz 0.5.0 (path+file://[..]/foo/baz)"
],
"workspace_root": "[..]/foo"
}
"#,
)
.run();
}
@@ -372,7 +372,7 @@ fn named_config_profile() {
// normal package
let mode = CompileMode::Build;
let kind = CompileKind::Host;
-let p = profiles.get_profile(a_pkg, true, true, UnitFor::new_normal(), mode, kind);
+let p = profiles.get_profile(a_pkg, true, true, UnitFor::new_normal(kind), mode, kind);
assert_eq!(p.name, "foo");
assert_eq!(p.codegen_units, Some(2)); // "foo" from config
assert_eq!(p.opt_level, "1"); // "middle" from manifest
@@ -381,7 +381,14 @@ fn named_config_profile() {
assert_eq!(p.overflow_checks, true); // "dev" built-in (ignore package override)
// build-override
-let bo = profiles.get_profile(a_pkg, true, true, UnitFor::new_host(false), mode, kind);
+let bo = profiles.get_profile(
+    a_pkg,
+    true,
+    true,
+    UnitFor::new_host(false, kind),
+    mode,
+    kind,
+);
assert_eq!(bo.name, "foo");
assert_eq!(bo.codegen_units, Some(6)); // "foo" build override from config
assert_eq!(bo.opt_level, "0"); // default to zero
@@ -390,7 +397,7 @@ fn named_config_profile() {
assert_eq!(bo.overflow_checks, true); // SAME as normal
// package overrides
-let po = profiles.get_profile(dep_pkg, false, true, UnitFor::new_normal(), mode, kind);
+let po = profiles.get_profile(dep_pkg, false, true, UnitFor::new_normal(kind), mode, kind);
assert_eq!(po.name, "foo");
assert_eq!(po.codegen_units, Some(7)); // "foo" package override from config
assert_eq!(po.opt_level, "1"); // SAME as normal