
Visibility rules

Open kraigher opened this issue 7 years ago • 5 comments

The next step is to work on visibility rules for use clauses and nested scopes.

Keywords: potentially visible, directly visible, implicit, explicit, declarative region, immediate scope.

I have identified the following rules.

Immediate scope always takes precedence for non-overloaded names

status: Done

constant ident : natural := 0;
use work.pkg.ident;
-- local ident trumps use
constant ident2 : natural := ident;

status: Pending analysis of formal part of association

use work.pkg;
type rec_t is record
  const : natural;
end record;
constant pkg : rec_t := (const => 0);
-- constant pkg : natural := 0; -- if pkg were not a record, the selected name below would be an error

-- Uses the record constant
constant const3 : natural := pkg.const;

status: Pending resolving overloaded names by signature

package pkg is
  procedure foo;
end package;

package body pkg is
  procedure foo is
  begin
    report "pkg foo";
  end procedure;
end package body;

entity ent is
end entity;

architecture a of ent is
  procedure foo is
  begin
    report "arch foo";
  end procedure;
begin
  main : process
    use work.pkg.foo;
  begin
    foo; -- arch foo
  end process;
end architecture;

Multiple non-overloaded names require disambiguation

status: Done

use work.pkg1.ident;
use work.pkg2.ident;
-- Error: requires disambiguation even if pkg1.ident and pkg2.ident are different types
-- Also true if .all was used
constant ident2 : natural := ident;

-- The following would work without ambiguity though
use work.pkg1.ident;
constant ident2 : natural := ident;
use work.pkg2.ident;

Multiple overloaded names do not require disambiguation

status: Pending resolving overloaded names by signature

package pkg is
 function fun return natural;
 function same return natural;
end package;

package pkg2 is
  function fun return boolean;
  function same return natural;
end package;

use work.pkg.all;
use work.pkg2.all;
-- Works
constant ident : natural := fun;
-- Requires disambiguation
constant ident2 : natural := same;

Mixed overloaded names require disambiguation

status: Done

package pkg is
 function fun return natural;
 function same return natural;
end package;

package pkg2 is
  constant fun : boolean := false;
  constant same : natural := 0;
end package;

use work.pkg.all;
use work.pkg2.all;
-- Requires disambiguation
constant ident : natural := fun;
-- Requires disambiguation
constant ident2 : natural := same;

Enum literals are automatically visible when using the type

status: Done

package pkg is
  type enum_t is (v1, v2);
end package;

use work.pkg.enum_t;
constant ident: enum_t := v1;

Enum literals are overloaded

status: Done

package pkg is
  type enum_t is (v1, v2);
end package;

package pkg2 is
  type enum2_t is (v1, v2);
end package;

use work.pkg.enum_t;
use work.pkg2.enum2_t;

-- works because of overloading
constant ident: enum_t := v1;

status: Done

package pkg is
  type enum_t is (v1, v2);
end package;

package pkg2 is
  constant v1 : natural := 0;
end package;

use work.pkg.enum_t;
use work.pkg2.v1;

-- Ambiguous
constant ident : enum_t := v1;

May use nested packages

status: Done

package gpkg is
  generic (constant init : natural);
  constant const : natural := init;
end package;

package pkg is
  use work.gpkg;
  package ipkg is new gpkg generic map (init => 1);
end package;

package pkg2 is
  use work.pkg.ipkg.all;
  constant ident : natural := const;
end package;

Library names are nothing special

status: Done

library lib;

package pkg is
  type rec_t is record
    pkg : natural;
  end record;

  constant work : rec_t := (pkg => 0);
  constant foo : natural := work.pkg;

  constant lib : rec_t := (pkg => 0);
  constant foo2 : natural := lib.pkg;

  -- Should not work since lib now denotes the constant above, not the library
  use lib.pkg.all;
end package;

Enum literals must be handled specially when they are implicitly visible

status: Requires type inference to make names unambiguous. Currently no error is reported

package pkg is
  type enum_t is (v1, v2);
  type enum2_t is (v2, v1);
end package;

use work.pkg.enum_t;
-- Ambiguous if not commented out
-- use work.pkg.enum2_t;

package pkg2 is
  -- A use clause naming an enum type should not make all v1 names visible, only the one from that specific type.
  constant const : string := v1'instance_name;
end package;

Directly visible declarations always take precedence over use clause even when nested

status: Done

package pkg is
  constant const : natural := 0;
end package;

entity tb_ent is
end entity;

architecture a of tb_ent is
  constant const : natural := 1;
begin
  main : process
    use work.pkg.const;
  begin
    assert const = 1; 
  end process;
end architecture;

Nested use clauses still conflict

status: Done

package pkg is
  constant const : natural := 0;
end package;

package pkg2 is
  constant const : natural := 2;
end package;

entity tb_ent is
end entity;

architecture a of tb_ent is
  use work.pkg.const;
begin
  main : process
    use work.pkg2.const;
  begin
    report integer'image(const); -- Should be an error as const is ambiguous
  end process;
end architecture;

Implicitly defined items never conflict with explicit ones

status: Pending

package pkg is
  function MINIMUM (L, R: REAL) return REAL;
end package;

package body pkg is
  function MINIMUM (L, R: REAL) return REAL is
  begin
    return L + R;
  end;
end package body;

entity ent is
end entity;

use std.standard.MINIMUM;
use work.pkg.MINIMUM;

architecture a of ent is
begin
    main : process
    begin
      report real'image(MINIMUM(1.0, 2.0)); -- Not ambiguous, prints 3.0
    end process;
end architecture;

kraigher avatar Nov 24 '18 08:11 kraigher

I'm trying to get my head around how a semantic pass might go.

  • Step through the AST of a design unit.

  • A DeclarativeRegion structure keeps track of what symbols have been declared in this region so far (a rough sketch of such a structure follows this list).

  • As we process declarations we add the new symbols to the Declarative Region.

  • As we process symbol uses we check if the symbols have already been declared in this region. If they haven't, we check the parent regions recursively. If the declaration is present in this design unit we link the use of the symbol to its declaration. If the declaration is not present we add it to a set of unresolved symbols along with the set of packages that were in scope at the time of use.

  • Step through all design units

  • Go back to each design unit and resolve the unresolved symbols. Identify circular dependencies. Link symbol uses with their declarations in appropriate packages. Identify ambiguous symbol uses.
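To make this concrete, here is a minimal Rust sketch of such a region structure. All names (DeclarativeRegion, Declaration, lookup) are hypothetical and heavily simplified compared to whatever rust_hdl ends up using; in particular it ignores overloading, use clauses and potential visibility:

use std::collections::HashMap;

// Hypothetical, simplified types; the real rust_hdl AST differs.
type Symbol = String;

struct Declaration {
    designator: Symbol,
}

// One scope level; parent regions are searched when a local lookup misses.
struct DeclarativeRegion<'a> {
    parent: Option<&'a DeclarativeRegion<'a>>,
    declarations: HashMap<Symbol, Declaration>,
}

impl<'a> DeclarativeRegion<'a> {
    fn new(parent: Option<&'a DeclarativeRegion<'a>>) -> Self {
        DeclarativeRegion {
            parent,
            declarations: HashMap::new(),
        }
    }

    // Record a declaration as it is encountered while walking the AST.
    fn add(&mut self, decl: Declaration) {
        self.declarations.insert(decl.designator.clone(), decl);
    }

    // Resolve a name: the innermost region wins, then parents recursively.
    fn lookup(&self, name: &str) -> Option<&Declaration> {
        self.declarations
            .get(name)
            .or_else(|| self.parent.and_then(|p| p.lookup(name)))
    }
}

fn main() {
    let mut arch = DeclarativeRegion::new(None);
    arch.add(Declaration { designator: "const".into() });

    let mut process = DeclarativeRegion::new(Some(&arch));
    process.add(Declaration { designator: "ident".into() });

    assert!(process.lookup("const").is_some()); // found in the parent region
    assert!(process.lookup("missing").is_none()); // would go into the unresolved set
}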

Is this the general direction you were thinking? At the moment it looks like the semantic analysis is making sure declarations are consistent with one another but hasn't worried about symbol uses yet. Is that correct?

benreynwar avatar Dec 04 '18 04:12 benreynwar

My current thinking is:

  • Traverse the entire AST and build DeclarativeRegions. The DeclarativeRegions only live temporarily in this pass and are thrown away afterwards. Public declarative regions, such as those in packages, might be saved for later use.
  • Design units are only analyzed once and will be analyzed on-demand (See 2. in #7)
  • Check for duplicate declarations as you go.
    • DONE
  • Check some rules regarding protected type declaration/body deferred constant and incomplete type when a region is complete.
    • DONE
  • Check for missing declarations in all names and use clauses.
    • Partially done for use clauses; this is the next step

Then there are two options to proceed (a rough illustration follows the list):

  1. Annotate back to the AST (by mutating it) the multiple potential references for each name and perform type checking in a separate pass.
  2. Perform type checking in the same pass and write back to the AST the single resolved reference.
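Purely as an illustration of the difference, using made-up types rather than the real rust_hdl AST:

// Hypothetical types, not the actual rust_hdl AST.
#[derive(Debug)]
struct DeclarationId(usize);

#[derive(Debug)]
enum NameReference {
    // Option 1: keep every potential homograph and let a later
    // type-checking pass narrow the list down to one declaration.
    Candidates(Vec<DeclarationId>),
    // Option 2: type check in the same pass and store only the single
    // resolved declaration.
    Resolved(DeclarationId),
}

fn main() {
    // An overloaded name such as `fun` annotated before type checking (option 1)...
    let annotated = NameReference::Candidates(vec![DeclarationId(3), DeclarationId(7)]);
    // ...and the same name after resolution against the expected type (option 2).
    let resolved = NameReference::Resolved(DeclarationId(3));
    println!("{:?} -> {:?}", annotated, resolved);
}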

Also note that I use a test-driven development workflow, which means I have an idea of where I want to go but I do not go ahead and build everything I think I will need without having tests that require it. Thus the current implementation may have shortcomings compared to my final intention, since I have not defined the test cases that require the final intention. This is also the reason I have been holding off adding new fields to the AST: I want tests to require them, and I do not want to make premature changes to the parsed AST (which is well understood by me) to carry semantic information (where I am still building my mental model of how it will work).

My strategy is also to develop the semantic analysis such that the example project always continues to work without errors. This means I will avoid partial solutions that cause false errors in favor of partial solutions that may miss some errors (false negatives). In this way the language server will never be unusable and will provide incrementally more value until it is "complete".

kraigher avatar Dec 04 '18 05:12 kraigher

OK. So when we hit a "use work.mypkg.all", we pause the analysis of the design unit, and go and parse the appropriate package. Similarly if we hit a "myblah: entity work.blah" we pause the analysis, and go and parse that entity. This allows the semantic analysis to be done in a single pass, but reduces the possibility for parallelism. Is this implemented already? I didn't see it there.

benreynwar avatar Dec 04 '18 15:12 benreynwar

The on-demand analysis is implemented in my local repository. I will most likely push it today. This method still allows enough parallelism: you can start analysis of many design units in parallel and use locking to avoid duplicate analysis or race conditions. Only serial chains of dependencies would not get analysed in parallel due to locking, but for large projects where parallel speed matters the dependency graph has high fan-out, so there should be plenty of independent units to analyse.
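As a sketch of the locking idea only (Library, AnalysisResult and analyze are made-up names, not the actual rust_hdl API), assuming one synchronization cell per design unit:

use std::collections::HashMap;
use std::sync::{Arc, OnceLock};
use std::thread;

// Hypothetical analysis result; the real rust_hdl types differ.
#[derive(Clone)]
struct AnalysisResult {
    unit: String,
}

// One OnceLock per design unit: the first thread that needs a unit analyzes
// it, everyone else blocks briefly and then reuses the cached result.
struct Library {
    units: HashMap<String, Arc<OnceLock<AnalysisResult>>>,
}

impl Library {
    fn new(names: &[&str]) -> Self {
        Library {
            units: names
                .iter()
                .map(|n| (n.to_string(), Arc::new(OnceLock::new())))
                .collect(),
        }
    }

    // On-demand entry point: the top-level loop calls this for every unit,
    // and a unit's own analysis would call it again for each use clause or
    // instantiated entity it encounters. A real implementation would also
    // need to detect circular dependencies.
    fn analyze(&self, name: &str) -> AnalysisResult {
        let cell = self.units.get(name).expect("unknown unit").clone();
        cell.get_or_init(|| AnalysisResult { unit: name.to_string() })
            .clone()
    }
}

fn main() {
    let library = Arc::new(Library::new(&["pkg", "pkg2", "ent"]));

    // Start analysis of all design units in parallel; shared dependencies
    // are analyzed exactly once thanks to the per-unit cell.
    let handles: Vec<_> = ["pkg", "pkg2", "ent"]
        .iter()
        .map(|name| {
            let library = Arc::clone(&library);
            let name = name.to_string();
            thread::spawn(move || library.analyze(&name))
        })
        .collect();

    for handle in handles {
        println!("analyzed {}", handle.join().unwrap().unit);
    }
}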

kraigher avatar Dec 04 '18 16:12 kraigher

I pushed the first version of the on-demand analysis now.

kraigher avatar Dec 05 '18 08:12 kraigher