openhgnn.models.SimpleHGN

class SimpleHGN(edge_dim, num_etypes, in_dim, hidden_dim, num_classes, num_layers, heads, feat_drop, negative_slope, residual, beta, ntypes)[source]

This is the SimpleHGN model from the paper Are we really making much progress? Revisiting, benchmarking, and refining heterogeneous graph neural networks.

The model extends the graph attention mechanism of GAT by incorporating edge-type information into the attention calculation.

Calculating the coefficient:

\[\alpha_{ij} = \frac{\exp\left(\mathrm{LeakyReLU}\left(a^T[Wh_i||Wh_j||W_r r_{\psi(\langle i,j\rangle)}]\right)\right)}{\sum_{k\in\mathcal{N}_i}\exp\left(\mathrm{LeakyReLU}\left(a^T[Wh_i||Wh_k||W_r r_{\psi(\langle i,k\rangle)}]\right)\right)} \quad (1)\]
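
A minimal tensor-level sketch of Eq. (1) is given below. It operates on plain PyTorch tensors with illustrative names (h_src, h_dst, etype_ids, r_emb, a); these are assumptions for exposition, not the openhgnn internals.

```python
# Minimal sketch of Eq. (1); tensor names are illustrative assumptions,
# not the openhgnn API.
import torch
import torch.nn.functional as F


def edge_type_attention_logits(h_src, h_dst, etype_ids, W, W_r, r_emb, a,
                               negative_slope=0.2):
    """Unnormalized attention logits for a batch of edges.

    h_src, h_dst : (E, in_dim)            source/destination node features per edge
    etype_ids    : (E,)                   edge-type index psi(<i, j>) per edge
    W            : (in_dim, d)            node feature projection
    W_r          : (edge_dim, d_r)        projection of the edge-type embedding
    r_emb        : (num_etypes, edge_dim) learnable edge-type embeddings
    a            : (2 * d + d_r,)         attention vector
    """
    z_src = h_src @ W                      # W h_i
    z_dst = h_dst @ W                      # W h_j
    z_rel = r_emb[etype_ids] @ W_r         # W_r r_{psi(<i,j>)}
    logits = torch.cat([z_src, z_dst, z_rel], dim=-1) @ a
    # alpha_ij follows from a softmax of these logits over the incoming
    # edges of each destination node i, as in the denominator of Eq. (1).
    return F.leaky_relu(logits, negative_slope)
```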

The residual connections include a node residual:

\[h_i^{(l)} = \sigma\left(\sum_{j\in \mathcal{N}_i} {\alpha_{ij}^{(l)}W^{(l)}h_j^{(l-1)}} + h_i^{(l-1)}\right) \quad (2)\]

and an edge residual:

\[\alpha_{ij}^{(l)} = (1-\beta)\alpha_{ij}^{(l)}+\beta\alpha_{ij}^{(l-1)} \quad (3)\]

Multi-head attention, concatenating the M heads:

\[h^{(l+1)}_j = \parallel^M_{m = 1}h^{(l + 1, m)}_j \quad (4)\]

With the residual connection:

\[h^{(l+1)}_j = h^{(l)}_j + \parallel^M_{m = 1}h^{(l + 1, m)}_j \quad (5)\]
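
The layer update in Eqs. (2)-(5) can be sketched as follows. The sketch assumes dense per-edge tensors and illustrative names (alpha_prev, Wh_j, dst_ids); it is not the openhgnn implementation, which operates on DGL graphs.

```python
# Minimal sketch of Eqs. (2)-(5); names and shapes are assumptions for
# illustration, not the openhgnn API.
import torch


def simple_hgn_update(alpha, alpha_prev, Wh_j, h_prev, dst_ids, beta):
    """alpha, alpha_prev : (E, M)    attention per edge and head, layers l and l-1
       Wh_j              : (E, M, d) projected source-node features per edge and head
       h_prev            : (N, M*d)  node features from the previous layer
       dst_ids           : (E,)      destination node index of each edge
    """
    num_nodes = h_prev.size(0)

    # Edge residual, Eq. (3): mix in the previous layer's attention.
    alpha = (1.0 - beta) * alpha + beta * alpha_prev

    # Weighted message aggregation over each node's incoming edges, Eq. (2)
    # (the nonlinearity sigma of Eq. (2) is omitted here for brevity).
    msg = alpha.unsqueeze(-1) * Wh_j                           # (E, M, d)
    agg = torch.zeros(num_nodes, msg.size(1), msg.size(2), dtype=msg.dtype)
    agg.index_add_(0, dst_ids, msg)                            # sum over j in N_i

    # Multi-head concatenation, Eq. (4), plus the node residual, Eq. (5).
    return h_prev + agg.flatten(1)                             # (N, M*d)
```
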
Parameters:
  • edge_dim (int) – the edge-type embedding dimension

  • num_etypes (int) – the number of edge types

  • in_dim (int) – the input feature dimension

  • hidden_dim (int) – the hidden dimension

  • num_classes (int) – the number of output classes

  • num_layers (int) – the number of layers

  • heads (list) – the number of attention heads in each layer

  • feat_drop (float) – the feature dropout rate

  • negative_slope (float) – the negative slope used in the LeakyReLU

  • residual (boolean) – whether to use the residual connection

  • beta (float) – the hyperparameter used in the edge residual

  • ntypes (list) – the list of node types
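
A hypothetical construction call following the signature above is shown below. The import path is assumed from the module name at the top of this page, and all argument values are placeholders rather than recommended defaults.

```python
# Hypothetical usage sketch; values are placeholders and the import path is
# assumed from the module path openhgnn.models.SimpleHGN shown above.
from openhgnn.models.SimpleHGN import SimpleHGN

model = SimpleHGN(
    edge_dim=64,                 # edge-type embedding dimension
    num_etypes=5,                # number of edge types
    in_dim=128,                  # input feature dimension
    hidden_dim=64,               # hidden dimension
    num_classes=3,               # number of output classes
    num_layers=2,                # number of layers
    heads=[8, 8, 1],             # attention heads per layer (length is an assumption)
    feat_drop=0.5,               # feature dropout rate
    negative_slope=0.05,         # LeakyReLU negative slope
    residual=True,               # enable residual connections
    beta=0.05,                   # edge-residual weight
    ntypes=['paper', 'author'],  # node types of the heterogeneous graph
)
```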