
Not able to get a prediction for the text below:


'1. Field of the Invention\nThis invention relates to the preparation of amides of fatty acids having from about 8 to about 20 carbon atoms. In particular, the invention relates to a process for producing the foregoing amides from esters by reaction thereof with amine reactant in an anhydrous system.\n2. Description of the Prior Art\nThe preparation of fatty acid amides from fatty acid esters can be accomplished by several processes known in the prior art. In the process of U.S. Pat. No. 3,253,006, the reaction is performed in the presence of what is described as a highly critical amount of water and under high pressures of above 1000 psig. Unfortunately the high pressures used necessitate expensive equipment capable of withstanding high pressure operation and the presence of water produces a very corrosive system requiring special materials of construction.\nIn other prior art the use of solvents other than water is disclosed. For example, U.S. Pat. No. 2,464,094 discloses the use of alcohol solvents such as methanol fed to the reaction system. Although the patent does discuss the subsequent removal of methanol, it does not suggest removal to the extent or in the manner disclosed herein. The problem of slow reaction rate in amidation of esters is evident in the prior art search for catalysts as disclosed, for example, in the main force of U.S. Pat. No. 2,464,094. Another patent dealing with solvents deliberately or fortuitously present is U.S. Pat. No. 2,504,427. Although this patent speaks of distilling off the by-product water or alcohol or using complexing agents, such is not undertaken until after the reaction is terminated.\nIn some instances, the use of catalysts such as salts or alkali metals is regarded as very much undesired. Not only is this an item of expense but also there is the problem of removal of the catalyst after its presence is no longer desired. A process that can be enhanced with catalyst yet which can be performed satisfactorily without catalyst can be useful in various ways.\nOther prior art includes processes in which operation is at low pressures and in the absence of water; however, as discussed in the aforementioned U.S. Pat. No. 3,253,006, the prior art operations under anhydrous conditions have been characteristically slow requiring reaction times of as much as several days. Such long reaction times are undesired for obvious reasons because of the adverse effect thereof upon the ability to produce amides at low cost.\nIt is accordingly an object of the present invention to provide a process for producing amides which does not require either high pressure of operation or catalysts.\nAnother object of the present invention is to provide a process for producing amides using anhydrous conditions. Another object of the present invention is to provide a process for producing amides that does not require solvents.\nAnother object of the present invention is to provide a process for producing amides by reaction of ester and amine reactant wherein high reaction rate is obtained in an anhydrous system at low pressure and in which amine reactant is used as a stripping agent to remove reaction by-products.\nAnother object of the present invention is to provide a process for producing amides from esters of fatty acids and an amine reactant wherein alcohol liberated from the esters in the course of the reaction is removed from the system by stripping with excess amine reactant.'

I am trying to get the bias score for the above text, but the library breaks and does not give the desired results.

```python
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification
from transformers import pipeline

tokenizer = AutoTokenizer.from_pretrained("d4data/bias-detection-model")
model = TFAutoModelForSequenceClassification.from_pretrained("d4data/bias-detection-model")
```

```
Some layers from the model checkpoint at d4data/bias-detection-model were not used when initializing TFDistilBertForSequenceClassification: ['dropout_19']
- This IS expected if you are initializing TFDistilBertForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing TFDistilBertForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some layers of TFDistilBertForSequenceClassification were not initialized from the model checkpoint at d4data/bias-detection-model and are newly initialized: ['dropout_59']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
```

```python
classifier = pipeline('text-classification', model=model, tokenizer=tokenizer)  # cuda = 0,1 based on gpu availability
classifier(data['text'][1181])
```

```
Traceback (most recent call last):
  File "", line 2, in
    classifier(data['text'][1181])
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\pipelines\text_classification.py", line 65, in __call__
    outputs = super().__call__(*args, **kwargs)
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\pipelines\base.py", line 676, in __call__
    return self._forward(inputs)
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\pipelines\base.py", line 693, in _forward
    predictions = self.model(inputs.data, training=False)[0]
  File "C:\Users\Rajneesh Jha\anaconda3\lib\site-packages\keras\utils\traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\models\distilbert\modeling_tf_distilbert.py", line 800, in call
    distilbert_output = self.distilbert(
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\models\distilbert\modeling_tf_distilbert.py", line 415, in call
    embedding_output = self.embeddings(
  File "C:\Users\Rajneesh Jha\AppData\Roaming\Python\Python38\site-packages\transformers\models\distilbert\modeling_tf_distilbert.py", line 119, in call
    position_embeds = tf.gather(params=self.position_embeddings, indices=position_ids)
InvalidArgumentError: Exception encountered when calling layer "embeddings" (type TFEmbeddings).

{{function_node _wrapped__ResourceGather_device/job:localhost/replica:0/task:0/device:CPU:0}} indices[0,529] = 529 is not in [0, 512) [Op:ResourceGather]

Call arguments received by layer "embeddings" (type TFEmbeddings):
  • input_ids=tf.Tensor(shape=(1, 732), dtype=int32)
  • position_ids=None
  • inputs_embeds=None
  • training=False
```
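For context, the failure happens because the tokenized text is 732 tokens long while DistilBERT's position-embedding table only covers 512 positions, so `tf.gather` is asked for index 529. Below is a minimal sketch of one possible workaround, assuming a score computed on the truncated text is acceptable; the `text` variable is a hypothetical placeholder for the patent description quoted above and is not part of the original report:

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("d4data/bias-detection-model")
model = TFAutoModelForSequenceClassification.from_pretrained("d4data/bias-detection-model")

# Placeholder for the long patent text quoted above (assumption, not from the issue)
text = "1. Field of the Invention ..."

# Truncate to the model's 512-position limit so tf.gather never sees an index >= 512
inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="tf")

logits = model(**inputs).logits
probs = tf.nn.softmax(logits, axis=-1).numpy()[0]
print({model.config.id2label[i]: float(p) for i, p in enumerate(probs)})
```

Truncation only scores the first 512 tokens; if the whole document matters, another option is to split the text into 512-token chunks and average (or otherwise aggregate) the per-chunk scores.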

rajneesh407 · Aug 31, 2023